This notebook is a template containing each step you need to complete for the project.
Please fill in your code wherever there are explicit ? markers in the notebook. You are welcome to add more cells and code as you see fit.
Once you have completed all the code implementations, please export your notebook as an HTML file so the reviewers can view your code. Make sure all cell outputs are displayed correctly.
File-> Export Notebook As... -> Export Notebook as HTML
There is also a writeup to complete after all code implementation is done. Please answer all questions and attach the necessary tables and charts. You can complete the writeup in either Markdown or PDF format.
Completing the code template and writeup template will cover all of the rubric points for this project.
The rubric contains "Stand Out Suggestions" for enhancing the project beyond the minimum requirements. The stand out suggestions are optional. If you decide to pursue the "stand out suggestions", you can include the code in this notebook and also discuss the results in the writeup file.
Below is an example of the steps to get the API username and key. Each student will have their own username and key.
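Once you have your credentials, they are typically stored in a small JSON file where the Kaggle CLI expects to find them. A minimal sketch, assuming the Kaggle API and placeholder values (the function name and the example username/key are illustrative — substitute your own):

```python
import json
import os
from pathlib import Path

def write_kaggle_credentials(username: str, key: str, config_dir: Path = None) -> Path:
    """Write a kaggle.json credentials file with owner-only permissions."""
    config_dir = config_dir or Path.home() / ".kaggle"
    config_dir.mkdir(parents=True, exist_ok=True)
    cred_path = config_dir / "kaggle.json"
    cred_path.write_text(json.dumps({"username": username, "key": key}))
    # The Kaggle CLI warns if this file is readable by other users
    os.chmod(cred_path, 0o600)
    return cred_path

# Example with placeholder values:
# write_kaggle_credentials("your_username", "your_key")
```

Keeping the file at ~/.kaggle/kaggle.json means later `kaggle` commands can authenticate without any extra configuration.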
Instance: ml.t3.medium (2 vCPU + 4 GiB); kernel: Python 3 (MXNet 1.8 Python 3.7 CPU Optimized)

from google.colab import drive
drive.mount('/content/drive')

Mounted at /content/drive
!pip install -U pip
!pip install -U setuptools wheel
!pip install -U "mxnet<2.0.0" bokeh==2.0.1
# Without --no-cache-dir, smaller AWS instances may have trouble installing
!pip install autogluon --no-cache-dir
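After the installs finish, it can help to confirm the resolved versions before continuing, since pip's resolver warnings above show some pins were forced. A small stdlib-only sketch (the package names are taken from the install commands above):

```python
from importlib import metadata

def installed_version(package: str) -> str:
    """Return the installed version of a distribution, or 'not installed'."""
    try:
        return metadata.version(package)
    except metadata.PackageNotFoundError:
        return "not installed"

# Report what the environment actually ended up with
for pkg in ("mxnet", "bokeh", "autogluon"):
    print(f"{pkg}: {installed_version(pkg)}")
```

If any of these print "not installed", restart the kernel and rerun the install cell before moving on.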
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Requirement already satisfied: pip in /usr/local/lib/python3.10/dist-packages (23.1.2)
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Requirement already satisfied: setuptools in /usr/local/lib/python3.10/dist-packages (67.7.2)
Collecting setuptools
Downloading setuptools-67.8.0-py3-none-any.whl (1.1 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.1/1.1 MB 23.3 MB/s eta 0:00:00
Requirement already satisfied: wheel in /usr/local/lib/python3.10/dist-packages (0.40.0)
Installing collected packages: setuptools
Attempting uninstall: setuptools
Found existing installation: setuptools 67.7.2
Uninstalling setuptools-67.7.2:
Successfully uninstalled setuptools-67.7.2
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
ipython 7.34.0 requires jedi>=0.16, which is not installed.
Successfully installed setuptools-67.8.0
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting mxnet<2.0.0
Downloading mxnet-1.9.1-py3-none-manylinux2014_x86_64.whl (49.1 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 49.1/49.1 MB 15.7 MB/s eta 0:00:00
Collecting bokeh==2.0.1
Downloading bokeh-2.0.1.tar.gz (8.6 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 8.6/8.6 MB 110.9 MB/s eta 0:00:00
Preparing metadata (setup.py) ... done
Requirement already satisfied: PyYAML>=3.10 in /usr/local/lib/python3.10/dist-packages (from bokeh==2.0.1) (6.0)
Requirement already satisfied: python-dateutil>=2.1 in /usr/local/lib/python3.10/dist-packages (from bokeh==2.0.1) (2.8.2)
Requirement already satisfied: Jinja2>=2.7 in /usr/local/lib/python3.10/dist-packages (from bokeh==2.0.1) (3.1.2)
Requirement already satisfied: numpy>=1.11.3 in /usr/local/lib/python3.10/dist-packages (from bokeh==2.0.1) (1.22.4)
Requirement already satisfied: pillow>=4.0 in /usr/local/lib/python3.10/dist-packages (from bokeh==2.0.1) (8.4.0)
Requirement already satisfied: packaging>=16.8 in /usr/local/lib/python3.10/dist-packages (from bokeh==2.0.1) (23.1)
Requirement already satisfied: tornado>=5 in /usr/local/lib/python3.10/dist-packages (from bokeh==2.0.1) (6.3.1)
Requirement already satisfied: typing_extensions>=3.7.4 in /usr/local/lib/python3.10/dist-packages (from bokeh==2.0.1) (4.5.0)
Requirement already satisfied: requests<3,>=2.20.0 in /usr/local/lib/python3.10/dist-packages (from mxnet<2.0.0) (2.27.1)
Collecting graphviz<0.9.0,>=0.8.1 (from mxnet<2.0.0)
Downloading graphviz-0.8.4-py2.py3-none-any.whl (16 kB)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/dist-packages (from Jinja2>=2.7->bokeh==2.0.1) (2.1.2)
Requirement already satisfied: six>=1.5 in /usr/local/lib/python3.10/dist-packages (from python-dateutil>=2.1->bokeh==2.0.1) (1.16.0)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.10/dist-packages (from requests<3,>=2.20.0->mxnet<2.0.0) (1.26.15)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.10/dist-packages (from requests<3,>=2.20.0->mxnet<2.0.0) (2022.12.7)
Requirement already satisfied: charset-normalizer~=2.0.0 in /usr/local/lib/python3.10/dist-packages (from requests<3,>=2.20.0->mxnet<2.0.0) (2.0.12)
Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.10/dist-packages (from requests<3,>=2.20.0->mxnet<2.0.0) (3.4)
Building wheels for collected packages: bokeh
Building wheel for bokeh (setup.py) ... done
Created wheel for bokeh: filename=bokeh-2.0.1-py3-none-any.whl size=9080019 sha256=baa6c1a6eeab7d284d26baae66ee7f0971c9932d88bef53b0afb48d321914143
Stored in directory: /root/.cache/pip/wheels/be/b4/d8/7ce778fd6e637bea03a561223a77ba6649aff8168e3c613754
Successfully built bokeh
Installing collected packages: graphviz, mxnet, bokeh
Attempting uninstall: graphviz
Found existing installation: graphviz 0.20.1
Uninstalling graphviz-0.20.1:
Successfully uninstalled graphviz-0.20.1
Attempting uninstall: bokeh
Found existing installation: bokeh 2.4.3
Uninstalling bokeh-2.4.3:
Successfully uninstalled bokeh-2.4.3
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
panel 0.14.4 requires bokeh<2.5.0,>=2.4.0, but you have bokeh 2.0.1 which is incompatible.
Successfully installed bokeh-2.0.1 graphviz-0.8.4 mxnet-1.9.1
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting autogluon
Downloading autogluon-0.7.0-py3-none-any.whl (9.7 kB)
Collecting autogluon.core[all]==0.7.0 (from autogluon)
Downloading autogluon.core-0.7.0-py3-none-any.whl (218 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 218.3/218.3 kB 10.0 MB/s eta 0:00:00
Collecting autogluon.features==0.7.0 (from autogluon)
Downloading autogluon.features-0.7.0-py3-none-any.whl (60 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 60.1/60.1 kB 219.8 MB/s eta 0:00:00
Collecting autogluon.tabular[all]==0.7.0 (from autogluon)
Downloading autogluon.tabular-0.7.0-py3-none-any.whl (292 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 292.2/292.2 kB 144.4 MB/s eta 0:00:00
Collecting autogluon.multimodal==0.7.0 (from autogluon)
Downloading autogluon.multimodal-0.7.0-py3-none-any.whl (331 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 331.1/331.1 kB 233.0 MB/s eta 0:00:00
Collecting autogluon.timeseries[all]==0.7.0 (from autogluon)
Downloading autogluon.timeseries-0.7.0-py3-none-any.whl (108 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 108.7/108.7 kB 210.4 MB/s eta 0:00:00
Requirement already satisfied: numpy<1.27,>=1.21 in /usr/local/lib/python3.10/dist-packages (from autogluon.core[all]==0.7.0->autogluon) (1.22.4)
Requirement already satisfied: scipy<1.12,>=1.5.4 in /usr/local/lib/python3.10/dist-packages (from autogluon.core[all]==0.7.0->autogluon) (1.10.1)
Requirement already satisfied: scikit-learn<1.3,>=1.0 in /usr/local/lib/python3.10/dist-packages (from autogluon.core[all]==0.7.0->autogluon) (1.2.2)
Collecting networkx<3.0,>=2.3 (from autogluon.core[all]==0.7.0->autogluon)
Downloading networkx-2.8.8-py3-none-any.whl (2.0 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.0/2.0 MB 197.6 MB/s eta 0:00:00
Requirement already satisfied: pandas<1.6,>=1.4.1 in /usr/local/lib/python3.10/dist-packages (from autogluon.core[all]==0.7.0->autogluon) (1.5.3)
Requirement already satisfied: tqdm<5,>=4.38 in /usr/local/lib/python3.10/dist-packages (from autogluon.core[all]==0.7.0->autogluon) (4.65.0)
Requirement already satisfied: requests in /usr/local/lib/python3.10/dist-packages (from autogluon.core[all]==0.7.0->autogluon) (2.27.1)
Requirement already satisfied: matplotlib in /usr/local/lib/python3.10/dist-packages (from autogluon.core[all]==0.7.0->autogluon) (3.7.1)
Collecting boto3<2,>=1.10 (from autogluon.core[all]==0.7.0->autogluon)
Downloading boto3-1.26.141-py3-none-any.whl (135 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 135.6/135.6 kB 339.6 MB/s eta 0:00:00
Collecting autogluon.common==0.7.0 (from autogluon.core[all]==0.7.0->autogluon)
Downloading autogluon.common-0.7.0-py3-none-any.whl (45 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 45.0/45.0 kB 254.2 MB/s eta 0:00:00
Requirement already satisfied: hyperopt<0.2.8,>=0.2.7 in /usr/local/lib/python3.10/dist-packages (from autogluon.core[all]==0.7.0->autogluon) (0.2.7)
Collecting ray[tune]<2.3,>=2.2 (from autogluon.core[all]==0.7.0->autogluon)
Downloading ray-2.2.0-cp310-cp310-manylinux2014_x86_64.whl (57.4 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 57.4/57.4 MB 248.1 MB/s eta 0:00:00
Collecting Pillow<9.6,>=9.3 (from autogluon.multimodal==0.7.0->autogluon)
Downloading Pillow-9.5.0-cp310-cp310-manylinux_2_28_x86_64.whl (3.4 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.4/3.4 MB 344.3 MB/s eta 0:00:00
Collecting jsonschema<4.18,>=4.14 (from autogluon.multimodal==0.7.0->autogluon)
Downloading jsonschema-4.17.3-py3-none-any.whl (90 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 90.4/90.4 kB 358.2 MB/s eta 0:00:00
Collecting seqeval<1.3.0,>=1.2.2 (from autogluon.multimodal==0.7.0->autogluon)
Downloading seqeval-1.2.2.tar.gz (43 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 43.6/43.6 kB 298.7 MB/s eta 0:00:00
Preparing metadata (setup.py) ... done
Collecting evaluate<0.4.0,>=0.2.2 (from autogluon.multimodal==0.7.0->autogluon)
Downloading evaluate-0.3.0-py3-none-any.whl (72 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 72.9/72.9 kB 199.5 MB/s eta 0:00:00
Collecting accelerate<0.17,>=0.9 (from autogluon.multimodal==0.7.0->autogluon)
Downloading accelerate-0.16.0-py3-none-any.whl (199 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 199.7/199.7 kB 384.5 MB/s eta 0:00:00
Collecting timm<0.7.0,>=0.6.12 (from autogluon.multimodal==0.7.0->autogluon)
Downloading timm-0.6.13-py3-none-any.whl (549 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 549.1/549.1 kB 346.6 MB/s eta 0:00:00
Collecting torch<1.14,>=1.9 (from autogluon.multimodal==0.7.0->autogluon)
Downloading torch-1.13.1-cp310-cp310-manylinux1_x86_64.whl (887.5 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 887.5/887.5 MB 196.2 MB/s eta 0:00:00
Collecting torchvision<0.15.0 (from autogluon.multimodal==0.7.0->autogluon)
Downloading torchvision-0.14.1-cp310-cp310-manylinux1_x86_64.whl (24.2 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 24.2/24.2 MB 257.1 MB/s eta 0:00:00
Collecting fairscale<0.4.14,>=0.4.5 (from autogluon.multimodal==0.7.0->autogluon)
Downloading fairscale-0.4.13.tar.gz (266 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 266.3/266.3 kB 387.1 MB/s eta 0:00:00
Installing build dependencies ... done
Getting requirements to build wheel ... done
Installing backend dependencies ... done
Preparing metadata (pyproject.toml) ... done
Requirement already satisfied: scikit-image<0.20.0,>=0.19.1 in /usr/local/lib/python3.10/dist-packages (from autogluon.multimodal==0.7.0->autogluon) (0.19.3)
Collecting pytorch-lightning<1.10.0,>=1.9.0 (from autogluon.multimodal==0.7.0->autogluon)
Downloading pytorch_lightning-1.9.5-py3-none-any.whl (829 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 829.5/829.5 kB 344.4 MB/s eta 0:00:00
Requirement already satisfied: text-unidecode<1.4,>=1.3 in /usr/local/lib/python3.10/dist-packages (from autogluon.multimodal==0.7.0->autogluon) (1.3)
Collecting torchmetrics<0.9.0,>=0.8.0 (from autogluon.multimodal==0.7.0->autogluon)
Downloading torchmetrics-0.8.2-py3-none-any.whl (409 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 409.8/409.8 kB 251.4 MB/s eta 0:00:00
Collecting transformers<4.27.0,>=4.23.0 (from autogluon.multimodal==0.7.0->autogluon)
Downloading transformers-4.26.1-py3-none-any.whl (6.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.3/6.3 MB 196.8 MB/s eta 0:00:00
Collecting nptyping<2.5.0,>=1.4.4 (from autogluon.multimodal==0.7.0->autogluon)
Downloading nptyping-2.4.1-py3-none-any.whl (36 kB)
Collecting omegaconf<2.3.0,>=2.1.1 (from autogluon.multimodal==0.7.0->autogluon)
Downloading omegaconf-2.2.3-py3-none-any.whl (79 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 79.3/79.3 kB 215.4 MB/s eta 0:00:00
Collecting sentencepiece<0.2.0,>=0.1.95 (from autogluon.multimodal==0.7.0->autogluon)
Downloading sentencepiece-0.1.99-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.3/1.3 MB 400.0 MB/s eta 0:00:00
Collecting pytorch-metric-learning<2.0,>=1.3.0 (from autogluon.multimodal==0.7.0->autogluon)
Downloading pytorch_metric_learning-1.7.3-py3-none-any.whl (112 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 112.2/112.2 kB 228.4 MB/s eta 0:00:00
Collecting nlpaug<1.2.0,>=1.1.10 (from autogluon.multimodal==0.7.0->autogluon)
Downloading nlpaug-1.1.11-py3-none-any.whl (410 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 410.5/410.5 kB 407.3 MB/s eta 0:00:00
Requirement already satisfied: nltk<4.0.0,>=3.4.5 in /usr/local/lib/python3.10/dist-packages (from autogluon.multimodal==0.7.0->autogluon) (3.8.1)
Collecting openmim<0.4.0,>0.1.5 (from autogluon.multimodal==0.7.0->autogluon)
Downloading openmim-0.3.7-py2.py3-none-any.whl (51 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 51.3/51.3 kB 221.6 MB/s eta 0:00:00
Requirement already satisfied: defusedxml<0.7.2,>=0.7.1 in /usr/local/lib/python3.10/dist-packages (from autogluon.multimodal==0.7.0->autogluon) (0.7.1)
Requirement already satisfied: jinja2<3.2,>=3.0.3 in /usr/local/lib/python3.10/dist-packages (from autogluon.multimodal==0.7.0->autogluon) (3.1.2)
Requirement already satisfied: tensorboard<3,>=2.9 in /usr/local/lib/python3.10/dist-packages (from autogluon.multimodal==0.7.0->autogluon) (2.12.2)
Collecting pytesseract<0.3.11,>=0.3.9 (from autogluon.multimodal==0.7.0->autogluon)
Downloading pytesseract-0.3.10-py3-none-any.whl (14 kB)
Collecting catboost<1.2,>=1.0 (from autogluon.tabular[all]==0.7.0->autogluon)
Downloading catboost-1.1.1-cp310-none-manylinux1_x86_64.whl (76.6 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 76.6/76.6 MB 196.4 MB/s eta 0:00:00
Requirement already satisfied: lightgbm<3.4,>=3.3 in /usr/local/lib/python3.10/dist-packages (from autogluon.tabular[all]==0.7.0->autogluon) (3.3.5)
Requirement already satisfied: xgboost<1.8,>=1.6 in /usr/local/lib/python3.10/dist-packages (from autogluon.tabular[all]==0.7.0->autogluon) (1.7.5)
Requirement already satisfied: fastai<2.8,>=2.3.1 in /usr/local/lib/python3.10/dist-packages (from autogluon.tabular[all]==0.7.0->autogluon) (2.7.12)
Requirement already satisfied: joblib<2,>=1.1 in /usr/local/lib/python3.10/dist-packages (from autogluon.timeseries[all]==0.7.0->autogluon) (1.2.0)
Requirement already satisfied: statsmodels<0.14,>=0.13.0 in /usr/local/lib/python3.10/dist-packages (from autogluon.timeseries[all]==0.7.0->autogluon) (0.13.5)
Collecting gluonts<0.13,>=0.12.0 (from autogluon.timeseries[all]==0.7.0->autogluon)
Downloading gluonts-0.12.8-py3-none-any.whl (1.2 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.2/1.2 MB 408.3 MB/s eta 0:00:00
Collecting statsforecast<1.5,>=1.4.0 (from autogluon.timeseries[all]==0.7.0->autogluon)
Downloading statsforecast-1.4.0-py3-none-any.whl (91 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 92.0/92.0 kB 327.6 MB/s eta 0:00:00
Collecting ujson<6,>=5 (from autogluon.timeseries[all]==0.7.0->autogluon)
Downloading ujson-5.7.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (52 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 52.8/52.8 kB 279.2 MB/s eta 0:00:00
Collecting sktime<0.16,>=0.14 (from autogluon.timeseries[all]==0.7.0->autogluon)
Downloading sktime-0.15.1-py3-none-any.whl (16.0 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 16.0/16.0 MB 169.6 MB/s eta 0:00:00
Collecting tbats<2,>=1.1 (from autogluon.timeseries[all]==0.7.0->autogluon)
Downloading tbats-1.1.3-py3-none-any.whl (44 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 44.0/44.0 kB 290.3 MB/s eta 0:00:00
Collecting pmdarima<1.9,>=1.8.2 (from autogluon.timeseries[all]==0.7.0->autogluon)
Downloading pmdarima-1.8.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (1.4 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.4/1.4 MB 398.6 MB/s eta 0:00:00
Requirement already satisfied: psutil<6,>=5.7.3 in /usr/local/lib/python3.10/dist-packages (from autogluon.common==0.7.0->autogluon.core[all]==0.7.0->autogluon) (5.9.5)
Requirement already satisfied: setuptools in /usr/local/lib/python3.10/dist-packages (from autogluon.common==0.7.0->autogluon.core[all]==0.7.0->autogluon) (67.8.0)
Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.10/dist-packages (from accelerate<0.17,>=0.9->autogluon.multimodal==0.7.0->autogluon) (23.1)
Requirement already satisfied: pyyaml in /usr/local/lib/python3.10/dist-packages (from accelerate<0.17,>=0.9->autogluon.multimodal==0.7.0->autogluon) (6.0)
Collecting botocore<1.30.0,>=1.29.141 (from boto3<2,>=1.10->autogluon.core[all]==0.7.0->autogluon)
Downloading botocore-1.29.141-py3-none-any.whl (10.8 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 10.8/10.8 MB 212.9 MB/s eta 0:00:00
Collecting jmespath<2.0.0,>=0.7.1 (from boto3<2,>=1.10->autogluon.core[all]==0.7.0->autogluon)
Downloading jmespath-1.0.1-py3-none-any.whl (20 kB)
Collecting s3transfer<0.7.0,>=0.6.0 (from boto3<2,>=1.10->autogluon.core[all]==0.7.0->autogluon)
Downloading s3transfer-0.6.1-py3-none-any.whl (79 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 79.8/79.8 kB 329.5 MB/s eta 0:00:00
Requirement already satisfied: graphviz in /usr/local/lib/python3.10/dist-packages (from catboost<1.2,>=1.0->autogluon.tabular[all]==0.7.0->autogluon) (0.8.4)
Requirement already satisfied: plotly in /usr/local/lib/python3.10/dist-packages (from catboost<1.2,>=1.0->autogluon.tabular[all]==0.7.0->autogluon) (5.13.1)
Requirement already satisfied: six in /usr/local/lib/python3.10/dist-packages (from catboost<1.2,>=1.0->autogluon.tabular[all]==0.7.0->autogluon) (1.16.0)
Collecting datasets>=2.0.0 (from evaluate<0.4.0,>=0.2.2->autogluon.multimodal==0.7.0->autogluon)
Downloading datasets-2.12.0-py3-none-any.whl (474 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 474.6/474.6 kB 252.6 MB/s eta 0:00:00
Collecting dill (from evaluate<0.4.0,>=0.2.2->autogluon.multimodal==0.7.0->autogluon)
Downloading dill-0.3.6-py3-none-any.whl (110 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 110.5/110.5 kB 310.4 MB/s eta 0:00:00
Collecting xxhash (from evaluate<0.4.0,>=0.2.2->autogluon.multimodal==0.7.0->autogluon)
Downloading xxhash-3.2.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (212 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 212.5/212.5 kB 373.9 MB/s eta 0:00:00
Collecting multiprocess (from evaluate<0.4.0,>=0.2.2->autogluon.multimodal==0.7.0->autogluon)
Downloading multiprocess-0.70.14-py310-none-any.whl (134 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 134.3/134.3 kB 373.7 MB/s eta 0:00:00
Requirement already satisfied: fsspec[http]>=2021.05.0 in /usr/local/lib/python3.10/dist-packages (from evaluate<0.4.0,>=0.2.2->autogluon.multimodal==0.7.0->autogluon) (2023.4.0)
Collecting huggingface-hub>=0.7.0 (from evaluate<0.4.0,>=0.2.2->autogluon.multimodal==0.7.0->autogluon)
Downloading huggingface_hub-0.14.1-py3-none-any.whl (224 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 224.5/224.5 kB 415.2 MB/s eta 0:00:00
Collecting responses<0.19 (from evaluate<0.4.0,>=0.2.2->autogluon.multimodal==0.7.0->autogluon)
Downloading responses-0.18.0-py3-none-any.whl (38 kB)
Requirement already satisfied: pip in /usr/local/lib/python3.10/dist-packages (from fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (23.1.2)
Requirement already satisfied: fastdownload<2,>=0.0.5 in /usr/local/lib/python3.10/dist-packages (from fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (0.0.7)
Requirement already satisfied: fastcore<1.6,>=1.5.29 in /usr/local/lib/python3.10/dist-packages (from fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (1.5.29)
Requirement already satisfied: fastprogress>=0.2.4 in /usr/local/lib/python3.10/dist-packages (from fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (1.0.3)
Requirement already satisfied: spacy<4 in /usr/local/lib/python3.10/dist-packages (from fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (3.5.2)
Requirement already satisfied: pydantic~=1.7 in /usr/local/lib/python3.10/dist-packages (from gluonts<0.13,>=0.12.0->autogluon.timeseries[all]==0.7.0->autogluon) (1.10.7)
Requirement already satisfied: toolz~=0.10 in /usr/local/lib/python3.10/dist-packages (from gluonts<0.13,>=0.12.0->autogluon.timeseries[all]==0.7.0->autogluon) (0.12.0)
Requirement already satisfied: typing-extensions~=4.0 in /usr/local/lib/python3.10/dist-packages (from gluonts<0.13,>=0.12.0->autogluon.timeseries[all]==0.7.0->autogluon) (4.5.0)
Requirement already satisfied: future in /usr/local/lib/python3.10/dist-packages (from hyperopt<0.2.8,>=0.2.7->autogluon.core[all]==0.7.0->autogluon) (0.18.3)
Requirement already satisfied: cloudpickle in /usr/local/lib/python3.10/dist-packages (from hyperopt<0.2.8,>=0.2.7->autogluon.core[all]==0.7.0->autogluon) (2.2.1)
Requirement already satisfied: py4j in /usr/local/lib/python3.10/dist-packages (from hyperopt<0.2.8,>=0.2.7->autogluon.core[all]==0.7.0->autogluon) (0.10.9.7)
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/dist-packages (from jinja2<3.2,>=3.0.3->autogluon.multimodal==0.7.0->autogluon) (2.1.2)
Requirement already satisfied: attrs>=17.4.0 in /usr/local/lib/python3.10/dist-packages (from jsonschema<4.18,>=4.14->autogluon.multimodal==0.7.0->autogluon) (23.1.0)
Requirement already satisfied: pyrsistent!=0.17.0,!=0.17.1,!=0.17.2,>=0.14.0 in /usr/local/lib/python3.10/dist-packages (from jsonschema<4.18,>=4.14->autogluon.multimodal==0.7.0->autogluon) (0.19.3)
Requirement already satisfied: wheel in /usr/local/lib/python3.10/dist-packages (from lightgbm<3.4,>=3.3->autogluon.tabular[all]==0.7.0->autogluon) (0.40.0)
Requirement already satisfied: gdown>=4.0.0 in /usr/local/lib/python3.10/dist-packages (from nlpaug<1.2.0,>=1.1.10->autogluon.multimodal==0.7.0->autogluon) (4.6.6)
Requirement already satisfied: click in /usr/local/lib/python3.10/dist-packages (from nltk<4.0.0,>=3.4.5->autogluon.multimodal==0.7.0->autogluon) (8.1.3)
Requirement already satisfied: regex>=2021.8.3 in /usr/local/lib/python3.10/dist-packages (from nltk<4.0.0,>=3.4.5->autogluon.multimodal==0.7.0->autogluon) (2022.10.31)
Collecting antlr4-python3-runtime==4.9.* (from omegaconf<2.3.0,>=2.1.1->autogluon.multimodal==0.7.0->autogluon)
Downloading antlr4-python3-runtime-4.9.3.tar.gz (117 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 117.0/117.0 kB 366.1 MB/s eta 0:00:00
Preparing metadata (setup.py) ... done
Collecting colorama (from openmim<0.4.0,>0.1.5->autogluon.multimodal==0.7.0->autogluon)
Downloading colorama-0.4.6-py2.py3-none-any.whl (25 kB)
Collecting model-index (from openmim<0.4.0,>0.1.5->autogluon.multimodal==0.7.0->autogluon)
Downloading model_index-0.1.11-py3-none-any.whl (34 kB)
Requirement already satisfied: rich in /usr/local/lib/python3.10/dist-packages (from openmim<0.4.0,>0.1.5->autogluon.multimodal==0.7.0->autogluon) (13.3.4)
Requirement already satisfied: tabulate in /usr/local/lib/python3.10/dist-packages (from openmim<0.4.0,>0.1.5->autogluon.multimodal==0.7.0->autogluon) (0.8.10)
Requirement already satisfied: python-dateutil>=2.8.1 in /usr/local/lib/python3.10/dist-packages (from pandas<1.6,>=1.4.1->autogluon.core[all]==0.7.0->autogluon) (2.8.2)
Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.10/dist-packages (from pandas<1.6,>=1.4.1->autogluon.core[all]==0.7.0->autogluon) (2022.7.1)
Requirement already satisfied: Cython!=0.29.18,>=0.29 in /usr/local/lib/python3.10/dist-packages (from pmdarima<1.9,>=1.8.2->autogluon.timeseries[all]==0.7.0->autogluon) (0.29.34)
Requirement already satisfied: urllib3 in /usr/local/lib/python3.10/dist-packages (from pmdarima<1.9,>=1.8.2->autogluon.timeseries[all]==0.7.0->autogluon) (1.26.15)
Collecting lightning-utilities>=0.6.0.post0 (from pytorch-lightning<1.10.0,>=1.9.0->autogluon.multimodal==0.7.0->autogluon)
Downloading lightning_utilities-0.8.0-py3-none-any.whl (20 kB)
Requirement already satisfied: filelock in /usr/local/lib/python3.10/dist-packages (from ray[tune]<2.3,>=2.2->autogluon.core[all]==0.7.0->autogluon) (3.12.0)
Requirement already satisfied: msgpack<2.0.0,>=1.0.0 in /usr/local/lib/python3.10/dist-packages (from ray[tune]<2.3,>=2.2->autogluon.core[all]==0.7.0->autogluon) (1.0.5)
Requirement already satisfied: protobuf!=3.19.5,>=3.15.3 in /usr/local/lib/python3.10/dist-packages (from ray[tune]<2.3,>=2.2->autogluon.core[all]==0.7.0->autogluon) (3.20.3)
Collecting aiosignal (from ray[tune]<2.3,>=2.2->autogluon.core[all]==0.7.0->autogluon)
Downloading aiosignal-1.3.1-py3-none-any.whl (7.6 kB)
Collecting frozenlist (from ray[tune]<2.3,>=2.2->autogluon.core[all]==0.7.0->autogluon)
Downloading frozenlist-1.3.3-cp310-cp310-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl (149 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 149.6/149.6 kB 314.5 MB/s eta 0:00:00
Collecting virtualenv>=20.0.24 (from ray[tune]<2.3,>=2.2->autogluon.core[all]==0.7.0->autogluon)
Downloading virtualenv-20.23.0-py3-none-any.whl (3.3 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.3/3.3 MB 405.8 MB/s eta 0:00:00
Requirement already satisfied: grpcio>=1.42.0 in /usr/local/lib/python3.10/dist-packages (from ray[tune]<2.3,>=2.2->autogluon.core[all]==0.7.0->autogluon) (1.54.0)
Collecting tensorboardX>=1.9 (from ray[tune]<2.3,>=2.2->autogluon.core[all]==0.7.0->autogluon)
Downloading tensorboardX-2.6-py2.py3-none-any.whl (114 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 114.5/114.5 kB 328.0 MB/s eta 0:00:00
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.10/dist-packages (from requests->autogluon.core[all]==0.7.0->autogluon) (2022.12.7)
Requirement already satisfied: charset-normalizer~=2.0.0 in /usr/local/lib/python3.10/dist-packages (from requests->autogluon.core[all]==0.7.0->autogluon) (2.0.12)
Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.10/dist-packages (from requests->autogluon.core[all]==0.7.0->autogluon) (3.4)
Requirement already satisfied: imageio>=2.4.1 in /usr/local/lib/python3.10/dist-packages (from scikit-image<0.20.0,>=0.19.1->autogluon.multimodal==0.7.0->autogluon) (2.25.1)
Requirement already satisfied: tifffile>=2019.7.26 in /usr/local/lib/python3.10/dist-packages (from scikit-image<0.20.0,>=0.19.1->autogluon.multimodal==0.7.0->autogluon) (2023.4.12)
Requirement already satisfied: PyWavelets>=1.1.1 in /usr/local/lib/python3.10/dist-packages (from scikit-image<0.20.0,>=0.19.1->autogluon.multimodal==0.7.0->autogluon) (1.4.1)
Requirement already satisfied: threadpoolctl>=2.0.0 in /usr/local/lib/python3.10/dist-packages (from scikit-learn<1.3,>=1.0->autogluon.core[all]==0.7.0->autogluon) (3.1.0)
Collecting deprecated>=1.2.13 (from sktime<0.16,>=0.14->autogluon.timeseries[all]==0.7.0->autogluon)
Downloading Deprecated-1.2.13-py2.py3-none-any.whl (9.6 kB)
Requirement already satisfied: numba>=0.55 in /usr/local/lib/python3.10/dist-packages (from sktime<0.16,>=0.14->autogluon.timeseries[all]==0.7.0->autogluon) (0.56.4)
Requirement already satisfied: patsy>=0.5.2 in /usr/local/lib/python3.10/dist-packages (from statsmodels<0.14,>=0.13.0->autogluon.timeseries[all]==0.7.0->autogluon) (0.5.3)
Requirement already satisfied: absl-py>=0.4 in /usr/local/lib/python3.10/dist-packages (from tensorboard<3,>=2.9->autogluon.multimodal==0.7.0->autogluon) (1.4.0)
Requirement already satisfied: google-auth<3,>=1.6.3 in /usr/local/lib/python3.10/dist-packages (from tensorboard<3,>=2.9->autogluon.multimodal==0.7.0->autogluon) (2.17.3)
Requirement already satisfied: google-auth-oauthlib<1.1,>=0.5 in /usr/local/lib/python3.10/dist-packages (from tensorboard<3,>=2.9->autogluon.multimodal==0.7.0->autogluon) (1.0.0)
Requirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.10/dist-packages (from tensorboard<3,>=2.9->autogluon.multimodal==0.7.0->autogluon) (3.4.3)
Requirement already satisfied: tensorboard-data-server<0.8.0,>=0.7.0 in /usr/local/lib/python3.10/dist-packages (from tensorboard<3,>=2.9->autogluon.multimodal==0.7.0->autogluon) (0.7.0)
Requirement already satisfied: tensorboard-plugin-wit>=1.6.0 in /usr/local/lib/python3.10/dist-packages (from tensorboard<3,>=2.9->autogluon.multimodal==0.7.0->autogluon) (1.8.1)
Requirement already satisfied: werkzeug>=1.0.1 in /usr/local/lib/python3.10/dist-packages (from tensorboard<3,>=2.9->autogluon.multimodal==0.7.0->autogluon) (2.3.0)
Collecting nvidia-cuda-runtime-cu11==11.7.99 (from torch<1.14,>=1.9->autogluon.multimodal==0.7.0->autogluon)
Downloading nvidia_cuda_runtime_cu11-11.7.99-py3-none-manylinux1_x86_64.whl (849 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 849.3/849.3 kB 406.4 MB/s eta 0:00:00
Collecting nvidia-cudnn-cu11==8.5.0.96 (from torch<1.14,>=1.9->autogluon.multimodal==0.7.0->autogluon)
Downloading nvidia_cudnn_cu11-8.5.0.96-2-py3-none-manylinux1_x86_64.whl (557.1 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 557.1/557.1 MB 13.7 MB/s eta 0:00:00
Collecting nvidia-cublas-cu11==11.10.3.66 (from torch<1.14,>=1.9->autogluon.multimodal==0.7.0->autogluon)
Downloading nvidia_cublas_cu11-11.10.3.66-py3-none-manylinux1_x86_64.whl (317.1 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 317.1/317.1 MB 259.8 MB/s eta 0:00:00
Collecting nvidia-cuda-nvrtc-cu11==11.7.99 (from torch<1.14,>=1.9->autogluon.multimodal==0.7.0->autogluon)
Downloading nvidia_cuda_nvrtc_cu11-11.7.99-2-py3-none-manylinux1_x86_64.whl (21.0 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 21.0/21.0 MB 295.1 MB/s eta 0:00:00
Collecting pyDeprecate==0.3.* (from torchmetrics<0.9.0,>=0.8.0->autogluon.multimodal==0.7.0->autogluon)
Downloading pyDeprecate-0.3.2-py3-none-any.whl (10 kB)
Collecting tokenizers!=0.11.3,<0.14,>=0.11.1 (from transformers<4.27.0,>=4.23.0->autogluon.multimodal==0.7.0->autogluon)
Downloading tokenizers-0.13.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (7.8 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.8/7.8 MB 331.6 MB/s eta 0:00:00
Requirement already satisfied: contourpy>=1.0.1 in /usr/local/lib/python3.10/dist-packages (from matplotlib->autogluon.core[all]==0.7.0->autogluon) (1.0.7)
Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.10/dist-packages (from matplotlib->autogluon.core[all]==0.7.0->autogluon) (0.11.0)
Requirement already satisfied: fonttools>=4.22.0 in /usr/local/lib/python3.10/dist-packages (from matplotlib->autogluon.core[all]==0.7.0->autogluon) (4.39.3)
Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.10/dist-packages (from matplotlib->autogluon.core[all]==0.7.0->autogluon) (1.4.4)
Requirement already satisfied: pyparsing>=2.3.1 in /usr/local/lib/python3.10/dist-packages (from matplotlib->autogluon.core[all]==0.7.0->autogluon) (3.0.9)
Requirement already satisfied: pyarrow>=8.0.0 in /usr/local/lib/python3.10/dist-packages (from datasets>=2.0.0->evaluate<0.4.0,>=0.2.2->autogluon.multimodal==0.7.0->autogluon) (9.0.0)
Collecting aiohttp (from datasets>=2.0.0->evaluate<0.4.0,>=0.2.2->autogluon.multimodal==0.7.0->autogluon)
Downloading aiohttp-3.8.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.0 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.0/1.0 MB 387.4 MB/s eta 0:00:00
Requirement already satisfied: wrapt<2,>=1.10 in /usr/local/lib/python3.10/dist-packages (from deprecated>=1.2.13->sktime<0.16,>=0.14->autogluon.timeseries[all]==0.7.0->autogluon) (1.14.1)
Requirement already satisfied: beautifulsoup4 in /usr/local/lib/python3.10/dist-packages (from gdown>=4.0.0->nlpaug<1.2.0,>=1.1.10->autogluon.multimodal==0.7.0->autogluon) (4.11.2)
Requirement already satisfied: cachetools<6.0,>=2.0.0 in /usr/local/lib/python3.10/dist-packages (from google-auth<3,>=1.6.3->tensorboard<3,>=2.9->autogluon.multimodal==0.7.0->autogluon) (5.3.0)
Requirement already satisfied: pyasn1-modules>=0.2.1 in /usr/local/lib/python3.10/dist-packages (from google-auth<3,>=1.6.3->tensorboard<3,>=2.9->autogluon.multimodal==0.7.0->autogluon) (0.3.0)
Requirement already satisfied: rsa<5,>=3.1.4 in /usr/local/lib/python3.10/dist-packages (from google-auth<3,>=1.6.3->tensorboard<3,>=2.9->autogluon.multimodal==0.7.0->autogluon) (4.9)
Requirement already satisfied: requests-oauthlib>=0.7.0 in /usr/local/lib/python3.10/dist-packages (from google-auth-oauthlib<1.1,>=0.5->tensorboard<3,>=2.9->autogluon.multimodal==0.7.0->autogluon) (1.3.1)
Requirement already satisfied: llvmlite<0.40,>=0.39.0dev0 in /usr/local/lib/python3.10/dist-packages (from numba>=0.55->sktime<0.16,>=0.14->autogluon.timeseries[all]==0.7.0->autogluon) (0.39.1)
Requirement already satisfied: spacy-legacy<3.1.0,>=3.0.11 in /usr/local/lib/python3.10/dist-packages (from spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (3.0.12)
Requirement already satisfied: spacy-loggers<2.0.0,>=1.0.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (1.0.4)
Requirement already satisfied: murmurhash<1.1.0,>=0.28.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (1.0.9)
Requirement already satisfied: cymem<2.1.0,>=2.0.2 in /usr/local/lib/python3.10/dist-packages (from spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (2.0.7)
Requirement already satisfied: preshed<3.1.0,>=3.0.2 in /usr/local/lib/python3.10/dist-packages (from spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (3.0.8)
Requirement already satisfied: thinc<8.2.0,>=8.1.8 in /usr/local/lib/python3.10/dist-packages (from spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (8.1.9)
Requirement already satisfied: wasabi<1.2.0,>=0.9.1 in /usr/local/lib/python3.10/dist-packages (from spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (1.1.1)
Requirement already satisfied: srsly<3.0.0,>=2.4.3 in /usr/local/lib/python3.10/dist-packages (from spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (2.4.6)
Requirement already satisfied: catalogue<2.1.0,>=2.0.6 in /usr/local/lib/python3.10/dist-packages (from spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (2.0.8)
Requirement already satisfied: typer<0.8.0,>=0.3.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (0.7.0)
Requirement already satisfied: pathy>=0.10.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (0.10.1)
Requirement already satisfied: smart-open<7.0.0,>=5.2.1 in /usr/local/lib/python3.10/dist-packages (from spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (6.3.0)
Requirement already satisfied: langcodes<4.0.0,>=3.2.0 in /usr/local/lib/python3.10/dist-packages (from spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (3.3.0)
Collecting distlib<1,>=0.3.6 (from virtualenv>=20.0.24->ray[tune]<2.3,>=2.2->autogluon.core[all]==0.7.0->autogluon)
Downloading distlib-0.3.6-py2.py3-none-any.whl (468 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 468.5/468.5 kB 391.0 MB/s eta 0:00:00
Requirement already satisfied: platformdirs<4,>=3.2 in /usr/local/lib/python3.10/dist-packages (from virtualenv>=20.0.24->ray[tune]<2.3,>=2.2->autogluon.core[all]==0.7.0->autogluon) (3.3.0)
Collecting ordered-set (from model-index->openmim<0.4.0,>0.1.5->autogluon.multimodal==0.7.0->autogluon)
Downloading ordered_set-4.1.0-py3-none-any.whl (7.6 kB)
Requirement already satisfied: tenacity>=6.2.0 in /usr/local/lib/python3.10/dist-packages (from plotly->catboost<1.2,>=1.0->autogluon.tabular[all]==0.7.0->autogluon) (8.2.2)
Requirement already satisfied: markdown-it-py<3.0.0,>=2.2.0 in /usr/local/lib/python3.10/dist-packages (from rich->openmim<0.4.0,>0.1.5->autogluon.multimodal==0.7.0->autogluon) (2.2.0)
Requirement already satisfied: pygments<3.0.0,>=2.13.0 in /usr/local/lib/python3.10/dist-packages (from rich->openmim<0.4.0,>0.1.5->autogluon.multimodal==0.7.0->autogluon) (2.14.0)
Collecting multidict<7.0,>=4.5 (from aiohttp->datasets>=2.0.0->evaluate<0.4.0,>=0.2.2->autogluon.multimodal==0.7.0->autogluon)
Downloading multidict-6.0.4-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (114 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 114.5/114.5 kB 357.7 MB/s eta 0:00:00
Collecting async-timeout<5.0,>=4.0.0a3 (from aiohttp->datasets>=2.0.0->evaluate<0.4.0,>=0.2.2->autogluon.multimodal==0.7.0->autogluon)
Downloading async_timeout-4.0.2-py3-none-any.whl (5.8 kB)
Collecting yarl<2.0,>=1.0 (from aiohttp->datasets>=2.0.0->evaluate<0.4.0,>=0.2.2->autogluon.multimodal==0.7.0->autogluon)
Downloading yarl-1.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (268 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 268.8/268.8 kB 389.6 MB/s eta 0:00:00
Requirement already satisfied: mdurl~=0.1 in /usr/local/lib/python3.10/dist-packages (from markdown-it-py<3.0.0,>=2.2.0->rich->openmim<0.4.0,>0.1.5->autogluon.multimodal==0.7.0->autogluon) (0.1.2)
Requirement already satisfied: pyasn1<0.6.0,>=0.4.6 in /usr/local/lib/python3.10/dist-packages (from pyasn1-modules>=0.2.1->google-auth<3,>=1.6.3->tensorboard<3,>=2.9->autogluon.multimodal==0.7.0->autogluon) (0.5.0)
Requirement already satisfied: oauthlib>=3.0.0 in /usr/local/lib/python3.10/dist-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib<1.1,>=0.5->tensorboard<3,>=2.9->autogluon.multimodal==0.7.0->autogluon) (3.2.2)
Requirement already satisfied: blis<0.8.0,>=0.7.8 in /usr/local/lib/python3.10/dist-packages (from thinc<8.2.0,>=8.1.8->spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (0.7.9)
Requirement already satisfied: confection<1.0.0,>=0.0.1 in /usr/local/lib/python3.10/dist-packages (from thinc<8.2.0,>=8.1.8->spacy<4->fastai<2.8,>=2.3.1->autogluon.tabular[all]==0.7.0->autogluon) (0.0.4)
Requirement already satisfied: soupsieve>1.2 in /usr/local/lib/python3.10/dist-packages (from beautifulsoup4->gdown>=4.0.0->nlpaug<1.2.0,>=1.1.10->autogluon.multimodal==0.7.0->autogluon) (2.4.1)
Requirement already satisfied: PySocks!=1.5.7,>=1.5.6 in /usr/local/lib/python3.10/dist-packages (from requests->autogluon.core[all]==0.7.0->autogluon) (1.7.1)
Building wheels for collected packages: fairscale, antlr4-python3-runtime, seqeval
Building wheel for fairscale (pyproject.toml) ... done
Created wheel for fairscale: filename=fairscale-0.4.13-py3-none-any.whl size=332112 sha256=a531629021ec29c11f73894550cc5667b33e065431b7618b54f8c29d4f9010b5
Stored in directory: /tmp/pip-ephem-wheel-cache-770gmvoc/wheels/78/a4/c0/fb0a7ef03cff161611c3fa40c6cf898f76e58ec421b88e8cb3
Building wheel for antlr4-python3-runtime (setup.py) ... done
Created wheel for antlr4-python3-runtime: filename=antlr4_python3_runtime-4.9.3-py3-none-any.whl size=144554 sha256=014325e7548b38f26cc6b836e72b0eeeee30ae789ef09cec9210d9adb7438a32
Stored in directory: /tmp/pip-ephem-wheel-cache-770gmvoc/wheels/12/93/dd/1f6a127edc45659556564c5730f6d4e300888f4bca2d4c5a88
Building wheel for seqeval (setup.py) ... done
Created wheel for seqeval: filename=seqeval-1.2.2-py3-none-any.whl size=16165 sha256=cad3de8e8c540f91a192545bd221b57a939655f1088fc8d5d6a4c0612298f269
Stored in directory: /tmp/pip-ephem-wheel-cache-770gmvoc/wheels/1a/67/4a/ad4082dd7dfc30f2abfe4d80a2ed5926a506eb8a972b4767fa
Successfully built fairscale antlr4-python3-runtime seqeval
Installing collected packages: tokenizers, sentencepiece, distlib, antlr4-python3-runtime, xxhash, virtualenv, ujson, tensorboardX, pyDeprecate, Pillow, ordered-set, omegaconf, nvidia-cuda-runtime-cu11, nvidia-cuda-nvrtc-cu11, nvidia-cublas-cu11, nptyping, networkx, multidict, lightning-utilities, jsonschema, jmespath, frozenlist, dill, deprecated, colorama, async-timeout, yarl, responses, pytesseract, nvidia-cudnn-cu11, multiprocess, model-index, huggingface-hub, botocore, aiosignal, transformers, torch, seqeval, s3transfer, ray, openmim, gluonts, catboost, aiohttp, torchvision, torchmetrics, statsforecast, sktime, pytorch-metric-learning, pmdarima, nlpaug, fairscale, boto3, accelerate, timm, tbats, pytorch-lightning, datasets, autogluon.common, evaluate, autogluon.features, autogluon.core, autogluon.tabular, autogluon.multimodal, autogluon.timeseries, autogluon
Attempting uninstall: Pillow
Found existing installation: Pillow 8.4.0
Uninstalling Pillow-8.4.0:
Successfully uninstalled Pillow-8.4.0
Attempting uninstall: networkx
Found existing installation: networkx 3.1
Uninstalling networkx-3.1:
Successfully uninstalled networkx-3.1
Attempting uninstall: jsonschema
Found existing installation: jsonschema 4.3.3
Uninstalling jsonschema-4.3.3:
Successfully uninstalled jsonschema-4.3.3
Attempting uninstall: torch
Found existing installation: torch 2.0.1+cu118
Uninstalling torch-2.0.1+cu118:
Successfully uninstalled torch-2.0.1+cu118
Attempting uninstall: torchvision
Found existing installation: torchvision 0.15.2+cu118
Uninstalling torchvision-0.15.2+cu118:
Successfully uninstalled torchvision-0.15.2+cu118
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
panel 0.14.4 requires bokeh<2.5.0,>=2.4.0, but you have bokeh 2.0.1 which is incompatible.
torchaudio 2.0.2+cu118 requires torch==2.0.1, but you have torch 1.13.1 which is incompatible.
torchdata 0.6.1 requires torch==2.0.1, but you have torch 1.13.1 which is incompatible.
torchtext 0.15.2 requires torch==2.0.1, but you have torch 1.13.1 which is incompatible.
Successfully installed Pillow-9.5.0 accelerate-0.16.0 aiohttp-3.8.4 aiosignal-1.3.1 antlr4-python3-runtime-4.9.3 async-timeout-4.0.2 autogluon-0.7.0 autogluon.common-0.7.0 autogluon.core-0.7.0 autogluon.features-0.7.0 autogluon.multimodal-0.7.0 autogluon.tabular-0.7.0 autogluon.timeseries-0.7.0 boto3-1.26.141 botocore-1.29.141 catboost-1.1.1 colorama-0.4.6 datasets-2.12.0 deprecated-1.2.13 dill-0.3.6 distlib-0.3.6 evaluate-0.3.0 fairscale-0.4.13 frozenlist-1.3.3 gluonts-0.12.8 huggingface-hub-0.14.1 jmespath-1.0.1 jsonschema-4.17.3 lightning-utilities-0.8.0 model-index-0.1.11 multidict-6.0.4 multiprocess-0.70.14 networkx-2.8.8 nlpaug-1.1.11 nptyping-2.4.1 nvidia-cublas-cu11-11.10.3.66 nvidia-cuda-nvrtc-cu11-11.7.99 nvidia-cuda-runtime-cu11-11.7.99 nvidia-cudnn-cu11-8.5.0.96 omegaconf-2.2.3 openmim-0.3.7 ordered-set-4.1.0 pmdarima-1.8.5 pyDeprecate-0.3.2 pytesseract-0.3.10 pytorch-lightning-1.9.5 pytorch-metric-learning-1.7.3 ray-2.2.0 responses-0.18.0 s3transfer-0.6.1 sentencepiece-0.1.99 seqeval-1.2.2 sktime-0.15.1 statsforecast-1.4.0 tbats-1.1.3 tensorboardX-2.6 timm-0.6.13 tokenizers-0.13.3 torch-1.13.1 torchmetrics-0.8.2 torchvision-0.14.1 transformers-4.26.1 ujson-5.7.0 virtualenv-20.23.0 xxhash-3.2.0 yarl-1.9.2
# create the .kaggle directory and an empty kaggle.json file
!mkdir -p /root/.kaggle
!touch /root/.kaggle/kaggle.json
!chmod 600 /root/.kaggle/kaggle.json
# Fill in your user name and key from creating the kaggle account and API token file
import json
kaggle_username = "<your-kaggle-username>"
kaggle_key = "<your-kaggle-api-key>"
# Save the API token to the kaggle.json file
with open("/root/.kaggle/kaggle.json", "w") as f:
f.write(json.dumps({"username": kaggle_username, "key": kaggle_key}))
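Before calling the Kaggle CLI, it can help to sanity-check the credentials file the cell above wrote: the CLI expects a JSON object with `username` and `key` fields, and warns if the file is world-readable. The `check_kaggle_json` helper below is illustrative (not part of the template); the demo runs against a throwaway file, but in the notebook you would pass `/root/.kaggle/kaggle.json`.

```python
import json
import os
import stat
import tempfile

def check_kaggle_json(path):
    """Return True if the file holds username/key fields and is chmod 600."""
    with open(path) as f:
        creds = json.load(f)
    has_fields = {"username", "key"} <= set(creds)
    is_private = stat.S_IMODE(os.stat(path).st_mode) == 0o600
    return has_fields and is_private

# Demo against a throwaway file (hypothetical path; in the notebook you would
# check "/root/.kaggle/kaggle.json" instead).
with tempfile.TemporaryDirectory() as d:
    p = os.path.join(d, "kaggle.json")
    with open(p, "w") as f:
        json.dump({"username": "user", "key": "secret"}, f)
    os.chmod(p, 0o600)
    print(check_kaggle_json(p))  # True
```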
!pip install kaggle
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Requirement already satisfied: kaggle in /usr/local/lib/python3.10/dist-packages (1.5.13)
Requirement already satisfied: six>=1.10 in /usr/local/lib/python3.10/dist-packages (from kaggle) (1.16.0)
Requirement already satisfied: certifi in /usr/local/lib/python3.10/dist-packages (from kaggle) (2022.12.7)
Requirement already satisfied: python-dateutil in /usr/local/lib/python3.10/dist-packages (from kaggle) (2.8.2)
Requirement already satisfied: requests in /usr/local/lib/python3.10/dist-packages (from kaggle) (2.27.1)
Requirement already satisfied: tqdm in /usr/local/lib/python3.10/dist-packages (from kaggle) (4.65.0)
Requirement already satisfied: python-slugify in /usr/local/lib/python3.10/dist-packages (from kaggle) (8.0.1)
Requirement already satisfied: urllib3 in /usr/local/lib/python3.10/dist-packages (from kaggle) (1.26.15)
Requirement already satisfied: text-unidecode>=1.3 in /usr/local/lib/python3.10/dist-packages (from python-slugify->kaggle) (1.3)
Requirement already satisfied: charset-normalizer~=2.0.0 in /usr/local/lib/python3.10/dist-packages (from requests->kaggle) (2.0.12)
Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.10/dist-packages (from requests->kaggle) (3.4)
# Download the dataset; it will be in a .zip file, so you'll need to unzip it as well.
!kaggle competitions download -c bike-sharing-demand
# If you already downloaded it, you can use the -o flag to overwrite the files
!unzip -o bike-sharing-demand.zip
Downloading bike-sharing-demand.zip to /content
100% 189k/189k [00:00<00:00, 287kB/s]
100% 189k/189k [00:00<00:00, 287kB/s]
Archive:  bike-sharing-demand.zip
  inflating: sampleSubmission.csv
  inflating: test.csv
  inflating: train.csv
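A quick existence check confirms the unzip produced the three expected csv files before the notebook goes on to read them. The `missing_files` helper is illustrative, not part of the template.

```python
from pathlib import Path

def missing_files(names, root="."):
    """Return the subset of `names` that does not exist under `root`."""
    return [n for n in names if not (Path(root) / n).exists()]

expected = ["train.csv", "test.csv", "sampleSubmission.csv"]
print(missing_files(expected))  # [] once the archive has been extracted
```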
import pandas as pd
from autogluon.tabular import TabularPredictor
import autogluon.core as ag
# Create the train dataset in pandas by reading the csv
# Set the parsing of the datetime column so you can use some of the `dt` features in pandas later
train = pd.read_csv("train.csv", parse_dates=['datetime'])
train.head()
| | datetime | season | holiday | workingday | weather | temp | atemp | humidity | windspeed | casual | registered | count |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2011-01-01 00:00:00 | 1 | 0 | 0 | 1 | 9.84 | 14.395 | 81 | 0.0 | 3 | 13 | 16 |
| 1 | 2011-01-01 01:00:00 | 1 | 0 | 0 | 1 | 9.02 | 13.635 | 80 | 0.0 | 8 | 32 | 40 |
| 2 | 2011-01-01 02:00:00 | 1 | 0 | 0 | 1 | 9.02 | 13.635 | 80 | 0.0 | 5 | 27 | 32 |
| 3 | 2011-01-01 03:00:00 | 1 | 0 | 0 | 1 | 9.84 | 14.395 | 75 | 0.0 | 3 | 10 | 13 |
| 4 | 2011-01-01 04:00:00 | 1 | 0 | 0 | 1 | 9.84 | 14.395 | 75 | 0.0 | 0 | 1 | 1 |
import os
os.getcwd()
'/content'
# Simple output of the train dataset to view some of the min/max/varition of the dataset features.
train.describe()
| | season | holiday | workingday | weather | temp | atemp | humidity | windspeed | casual | registered | count | hour |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| count | 10886.000000 | 10886.000000 | 10886.000000 | 10886.000000 | 10886.00000 | 10886.000000 | 10886.000000 | 10886.000000 | 10886.000000 | 10886.000000 | 10886.000000 | 10886.000000 |
| mean | 2.506614 | 0.028569 | 0.680875 | 1.418427 | 20.23086 | 23.655084 | 61.886460 | 12.799395 | 36.021955 | 155.552177 | 191.574132 | 11.541613 |
| std | 1.116174 | 0.166599 | 0.466159 | 0.633839 | 7.79159 | 8.474601 | 19.245033 | 8.164537 | 49.960477 | 151.039033 | 181.144454 | 6.915838 |
| min | 1.000000 | 0.000000 | 0.000000 | 1.000000 | 0.82000 | 0.760000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 1.000000 | 0.000000 |
| 25% | 2.000000 | 0.000000 | 0.000000 | 1.000000 | 13.94000 | 16.665000 | 47.000000 | 7.001500 | 4.000000 | 36.000000 | 42.000000 | 6.000000 |
| 50% | 3.000000 | 0.000000 | 1.000000 | 1.000000 | 20.50000 | 24.240000 | 62.000000 | 12.998000 | 17.000000 | 118.000000 | 145.000000 | 12.000000 |
| 75% | 4.000000 | 0.000000 | 1.000000 | 2.000000 | 26.24000 | 31.060000 | 77.000000 | 16.997900 | 49.000000 | 222.000000 | 284.000000 | 18.000000 |
| max | 4.000000 | 1.000000 | 1.000000 | 4.000000 | 41.00000 | 45.455000 | 100.000000 | 56.996900 | 367.000000 | 886.000000 | 977.000000 | 23.000000 |
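The `hour` column in the summary above is a feature derived from `datetime`; because the column was parsed with `parse_dates`, pandas' `dt` accessor makes the extraction a one-liner. A minimal sketch on a toy frame (the real notebook cell would operate on `train` instead):

```python
import pandas as pd

# Toy frame standing in for the real train data.
df = pd.DataFrame({"datetime": pd.to_datetime([
    "2011-01-01 00:00:00",
    "2011-01-01 05:00:00",
    "2011-01-01 17:00:00",
])})

# In the notebook: train["hour"] = train["datetime"].dt.hour
df["hour"] = df["datetime"].dt.hour
print(df["hour"].tolist())  # [0, 5, 17]
```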
# Create the test dataframe in pandas by reading the csv; remember to parse the datetime!
test = pd.read_csv("test.csv", parse_dates=['datetime'])
test.head()
| | datetime | season | holiday | workingday | weather | temp | atemp | humidity | windspeed |
|---|---|---|---|---|---|---|---|---|---|
| 0 | 2011-01-20 00:00:00 | 1 | 0 | 1 | 1 | 10.66 | 11.365 | 56 | 26.0027 |
| 1 | 2011-01-20 01:00:00 | 1 | 0 | 1 | 1 | 10.66 | 13.635 | 56 | 0.0000 |
| 2 | 2011-01-20 02:00:00 | 1 | 0 | 1 | 1 | 10.66 | 13.635 | 56 | 0.0000 |
| 3 | 2011-01-20 03:00:00 | 1 | 0 | 1 | 1 | 10.66 | 12.880 | 56 | 11.0014 |
| 4 | 2011-01-20 04:00:00 | 1 | 0 | 1 | 1 | 10.66 | 12.880 | 56 | 11.0014 |
# Same as the train and test datasets
submission = pd.read_csv("sampleSubmission.csv", parse_dates=['datetime'])
submission.head()
| | datetime | count |
|---|---|---|
| 0 | 2011-01-20 00:00:00 | 0 |
| 1 | 2011-01-20 01:00:00 | 0 |
| 2 | 2011-01-20 02:00:00 | 0 |
| 3 | 2011-01-20 03:00:00 | 0 |
| 4 | 2011-01-20 04:00:00 | 0 |
Requirements:
- We are predicting `count`, so it is the label we are setting.
- Ignore the `casual` and `registered` columns as they are also not present in the test dataset.
- Use `root_mean_squared_error` as the metric to use for evaluation.
- Use `best_quality` to focus on creating the best model.

predictor = TabularPredictor(label='count',
problem_type='regression',
path='/content/drive/MyDrive',
eval_metric='root_mean_squared_error').fit(train_data = train.drop(['casual', 'registered'],axis=1),
time_limit=600,
presets='best_quality',
ag_args_fit={'num_gpus': 1}
)
Warning: path already exists! This predictor may overwrite an existing predictor! path="/content/drive/MyDrive"
Presets specified: ['best_quality']
Stack configuration (auto_stack=True): num_stack_levels=1, num_bag_folds=8, num_bag_sets=20
Beginning AutoGluon training ... Time limit = 600s
AutoGluon will save models to "/content/drive/MyDrive/"
AutoGluon Version: 0.7.0
Python Version: 3.10.11
Operating System: Linux
Platform Machine: x86_64
Platform Version: #1 SMP Sat Apr 29 09:15:28 UTC 2023
Train Data Rows: 10886
Train Data Columns: 9
Label Column: count
Preprocessing data ...
Using Feature Generators to preprocess the data ...
Fitting AutoMLPipelineFeatureGenerator...
Available Memory: 12464.02 MB
Train Data (Original) Memory Usage: 0.78 MB (0.0% of available memory)
Inferring data type of each feature based on column values. Set feature_metadata_in to manually specify special dtypes of the features.
Stage 1 Generators:
Fitting AsTypeFeatureGenerator...
Note: Converting 2 features to boolean dtype as they only contain 2 unique values.
Stage 2 Generators:
Fitting FillNaFeatureGenerator...
Stage 3 Generators:
Fitting IdentityFeatureGenerator...
Fitting DatetimeFeatureGenerator...
Stage 4 Generators:
Fitting DropUniqueFeatureGenerator...
Types of features in original data (raw dtype, special dtypes):
('datetime', []) : 1 | ['datetime']
('float', []) : 3 | ['temp', 'atemp', 'windspeed']
('int', []) : 5 | ['season', 'holiday', 'workingday', 'weather', 'humidity']
Types of features in processed data (raw dtype, special dtypes):
('float', []) : 3 | ['temp', 'atemp', 'windspeed']
('int', []) : 3 | ['season', 'weather', 'humidity']
('int', ['bool']) : 2 | ['holiday', 'workingday']
('int', ['datetime_as_int']) : 5 | ['datetime', 'datetime.year', 'datetime.month', 'datetime.day', 'datetime.dayofweek']
0.1s = Fit runtime
9 features in original data used to generate 13 features in processed data.
Train Data (Processed) Memory Usage: 0.98 MB (0.0% of available memory)
Data preprocessing and feature engineering runtime = 0.17s ...
AutoGluon will gauge predictive performance using evaluation metric: 'root_mean_squared_error'
This metric's sign has been flipped to adhere to being higher_is_better. The metric score can be multiplied by -1 to get the metric value.
To change this, specify the eval_metric parameter of Predictor()
AutoGluon will fit 2 stack levels (L1 to L2) ...
Fitting 11 L1 models ...
Fitting model: KNeighborsUnif_BAG_L1 ... Training model for up to 399.78s of the 599.82s of remaining time.
-101.5462 = Validation score (-root_mean_squared_error)
0.04s = Training runtime
0.05s = Validation runtime
Fitting model: KNeighborsDist_BAG_L1 ... Training model for up to 399.61s of the 599.65s of remaining time.
-84.1251 = Validation score (-root_mean_squared_error)
0.03s = Training runtime
0.04s = Validation runtime
Fitting model: LightGBMXT_BAG_L1 ... Training model for up to 399.45s of the 599.49s of remaining time.
Fitting 8 child models (S1F1 - S1F8) | Fitting with ParallelLocalFoldFittingStrategy
-131.4609 = Validation score (-root_mean_squared_error)
105.21s = Training runtime
6.43s = Validation runtime
Fitting model: LightGBM_BAG_L1 ... Training model for up to 284.95s of the 484.99s of remaining time.
Fitting 8 child models (S1F1 - S1F8) | Fitting with ParallelLocalFoldFittingStrategy
-131.0542 = Validation score (-root_mean_squared_error)
40.33s = Training runtime
1.08s = Validation runtime
Fitting model: RandomForestMSE_BAG_L1 ... Training model for up to 240.75s of the 440.79s of remaining time.
-116.5484 = Validation score (-root_mean_squared_error)
18.44s = Training runtime
0.63s = Validation runtime
Fitting model: CatBoost_BAG_L1 ... Training model for up to 220.37s of the 420.41s of remaining time.
Fitting 8 child models (S1F1 - S1F8) | Fitting with ParallelLocalFoldFittingStrategy
-136.5203 = Validation score (-root_mean_squared_error)
153.48s = Training runtime
0.04s = Validation runtime
Fitting model: ExtraTreesMSE_BAG_L1 ... Training model for up to 63.89s of the 263.92s of remaining time.
-124.6007 = Validation score (-root_mean_squared_error)
4.78s = Training runtime
0.64s = Validation runtime
Fitting model: NeuralNetFastAI_BAG_L1 ... Training model for up to 57.11s of the 257.15s of remaining time.
Fitting 8 child models (S1F1 - S1F8) | Fitting with ParallelLocalFoldFittingStrategy
-150.3271 = Validation score (-root_mean_squared_error)
79.35s = Training runtime
0.23s = Validation runtime
Completed 1/20 k-fold bagging repeats ...
Fitting model: WeightedEnsemble_L2 ... Training model for up to 360.0s of the 174.77s of remaining time.
-84.1251 = Validation score (-root_mean_squared_error)
0.75s = Training runtime
0.0s = Validation runtime
Fitting 9 L2 models ...
Fitting model: LightGBMXT_BAG_L2 ... Training model for up to 173.97s of the 173.94s of remaining time.
Fitting 8 child models (S1F1 - S1F8) | Fitting with ParallelLocalFoldFittingStrategy
-60.6097 = Validation score (-root_mean_squared_error)
73.42s = Training runtime
3.03s = Validation runtime
Fitting model: LightGBM_BAG_L2 ... Training model for up to 96.19s of the 96.16s of remaining time.
Fitting 8 child models (S1F1 - S1F8) | Fitting with ParallelLocalFoldFittingStrategy
-54.8572 = Validation score (-root_mean_squared_error)
36.76s = Training runtime
0.23s = Validation runtime
Fitting model: RandomForestMSE_BAG_L2 ... Training model for up to 57.14s of the 57.12s of remaining time.
-53.345 = Validation score (-root_mean_squared_error)
36.75s = Training runtime
0.59s = Validation runtime
Fitting model: CatBoost_BAG_L2 ... Training model for up to 18.23s of the 18.21s of remaining time.
Fitting 8 child models (S1F1 - S1F8) | Fitting with ParallelLocalFoldFittingStrategy
2023-05-26 15:51:12,379 ERROR serialization.py:371 -- Failed to unpickle serialized exception
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/ray/exceptions.py", line 46, in from_ray_exception
return pickle.loads(ray_exception.serialized_exception)
ModuleNotFoundError: No module named '_catboost'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/ray/_private/serialization.py", line 369, in deserialize_objects
obj = self._deserialize_object(data, metadata, object_ref)
File "/usr/local/lib/python3.10/dist-packages/ray/_private/serialization.py", line 275, in _deserialize_object
return RayError.from_bytes(obj)
File "/usr/local/lib/python3.10/dist-packages/ray/exceptions.py", line 40, in from_bytes
return RayError.from_ray_exception(ray_exception)
File "/usr/local/lib/python3.10/dist-packages/ray/exceptions.py", line 49, in from_ray_exception
raise RuntimeError(msg) from e
RuntimeError: Failed to unpickle serialized exception
Warning: Exception caused CatBoost_BAG_L2 to fail during training... Skipping this model.
System error: Failed to unpickle serialized exception
traceback: Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/ray/exceptions.py", line 46, in from_ray_exception
return pickle.loads(ray_exception.serialized_exception)
ModuleNotFoundError: No module named '_catboost'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/ray/_private/serialization.py", line 369, in deserialize_objects
obj = self._deserialize_object(data, metadata, object_ref)
File "/usr/local/lib/python3.10/dist-packages/ray/_private/serialization.py", line 275, in _deserialize_object
return RayError.from_bytes(obj)
File "/usr/local/lib/python3.10/dist-packages/ray/exceptions.py", line 40, in from_bytes
return RayError.from_ray_exception(ray_exception)
File "/usr/local/lib/python3.10/dist-packages/ray/exceptions.py", line 49, in from_ray_exception
raise RuntimeError(msg) from e
RuntimeError: Failed to unpickle serialized exception
Detailed Traceback:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/trainer/abstract_trainer.py", line 1502, in _train_and_save
model = self._train_single(X, y, model, X_val, y_val, total_resources=total_resources, **model_fit_kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/trainer/abstract_trainer.py", line 1447, in _train_single
model = model.fit(X=X, y=y, X_val=X_val, y_val=y_val, total_resources=total_resources, **model_fit_kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/ensemble/stacker_ensemble_model.py", line 154, in _fit
return super()._fit(X=X, y=y, time_limit=time_limit, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/ensemble/bagged_ensemble_model.py", line 248, in _fit
self._fit_folds(X=X, y=y, model_base=model_base, X_pseudo=X_pseudo, y_pseudo=y_pseudo,
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/ensemble/bagged_ensemble_model.py", line 540, in _fit_folds
fold_fitting_strategy.after_all_folds_scheduled()
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/ensemble/fold_fitting_strategy.py", line 537, in after_all_folds_scheduled
raise processed_exception
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/ensemble/fold_fitting_strategy.py", line 505, in after_all_folds_scheduled
time_end_fit, predict_time, predict_1_time = self.ray.get(finished)
File "/usr/local/lib/python3.10/dist-packages/ray/_private/client_mode_hook.py", line 105, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/ray/_private/worker.py", line 2311, in get
raise value
ray.exceptions.RaySystemError: System error: Failed to unpickle serialized exception
traceback: Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/ray/exceptions.py", line 46, in from_ray_exception
return pickle.loads(ray_exception.serialized_exception)
ModuleNotFoundError: No module named '_catboost'
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/ray/_private/serialization.py", line 369, in deserialize_objects
obj = self._deserialize_object(data, metadata, object_ref)
File "/usr/local/lib/python3.10/dist-packages/ray/_private/serialization.py", line 275, in _deserialize_object
return RayError.from_bytes(obj)
File "/usr/local/lib/python3.10/dist-packages/ray/exceptions.py", line 40, in from_bytes
return RayError.from_ray_exception(ray_exception)
File "/usr/local/lib/python3.10/dist-packages/ray/exceptions.py", line 49, in from_ray_exception
raise RuntimeError(msg) from e
RuntimeError: Failed to unpickle serialized exception
Fitting model: ExtraTreesMSE_BAG_L2 ... Training model for up to 9.14s of the 9.12s of remaining time.
-53.7367 = Validation score (-root_mean_squared_error)
12.68s = Training runtime
0.55s = Validation runtime
Completed 1/20 k-fold bagging repeats ...
Fitting model: WeightedEnsemble_L3 ... Training model for up to 360.0s of the -5.13s of remaining time.
-52.7118 = Validation score (-root_mean_squared_error)
0.23s = Training runtime
0.0s = Validation runtime
AutoGluon training complete, total runtime = 605.75s ... Best model: "WeightedEnsemble_L3"
TabularPredictor saved. To load, use: predictor = TabularPredictor.load("/content/drive/MyDrive/")
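With training finished, the template's eventual submission step calls `predictor.predict(test)`; the bike-sharing competition's RMSLE scoring cannot accept negative counts, so predictions should be clipped at zero before filling the submission dataframe. A sketch of the clipping with stand-in values (the real predict call is commented out, since it needs the fitted `predictor` and `test` from earlier cells):

```python
import pandas as pd

# predictions = predictor.predict(test)  # real call, given predictor and test
predictions = pd.Series([12.3, -4.1, 0.0, 87.5])  # stand-in values for demo

# Counts cannot be negative; clip before building the submission.
predictions = predictions.clip(lower=0)
print(predictions.tolist())  # [12.3, 0.0, 0.0, 87.5]
```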
predictor.fit_summary()
*** Summary of fit() ***
Estimated performance of each model:
model score_val pred_time_val fit_time pred_time_val_marginal fit_time_marginal stack_level can_infer fit_order
0 WeightedEnsemble_L3 -52.711828 10.524795 488.078481 0.000913 0.233405 3 True 14
1 RandomForestMSE_BAG_L2 -53.345036 9.741804 438.411020 0.588712 36.746856 2 True 12
2 ExtraTreesMSE_BAG_L2 -53.736738 9.704154 414.342361 0.551062 12.678197 2 True 13
3 LightGBM_BAG_L2 -54.857239 9.384108 438.420023 0.231016 36.755860 2 True 11
4 LightGBMXT_BAG_L2 -60.609705 12.187556 475.079818 3.034464 73.415654 2 True 10
5 KNeighborsDist_BAG_L1 -84.125061 0.039281 0.031396 0.039281 0.031396 1 True 2
6 WeightedEnsemble_L2 -84.125061 0.040278 0.780146 0.000997 0.748750 2 True 9
7 KNeighborsUnif_BAG_L1 -101.546199 0.049865 0.044547 0.049865 0.044547 1 True 1
8 RandomForestMSE_BAG_L1 -116.548359 0.632007 18.440346 0.632007 18.440346 1 True 5
9 ExtraTreesMSE_BAG_L1 -124.600676 0.643144 4.775705 0.643144 4.775705 1 True 7
10 LightGBM_BAG_L1 -131.054162 1.081594 40.332702 1.081594 40.332702 1 True 4
11 LightGBMXT_BAG_L1 -131.460909 6.434841 105.211925 6.434841 105.211925 1 True 3
12 CatBoost_BAG_L1 -136.520280 0.039436 153.479697 0.039436 153.479697 1 True 6
13 NeuralNetFastAI_BAG_L1 -150.327073 0.232926 79.347846 0.232926 79.347846 1 True 8
Number of models trained: 14
Types of models trained:
{'StackerEnsembleModel_LGB', 'StackerEnsembleModel_RF', 'StackerEnsembleModel_XT', 'StackerEnsembleModel_KNN', 'WeightedEnsembleModel', 'StackerEnsembleModel_NNFastAiTabular', 'StackerEnsembleModel_CatBoost'}
Bagging used: True (with 8 folds)
Multi-layer stack-ensembling used: True (with 3 levels)
Feature Metadata (Processed):
(raw dtype, special dtypes):
('float', []) : 3 | ['temp', 'atemp', 'windspeed']
('int', []) : 3 | ['season', 'weather', 'humidity']
('int', ['bool']) : 2 | ['holiday', 'workingday']
('int', ['datetime_as_int']) : 5 | ['datetime', 'datetime.year', 'datetime.month', 'datetime.day', 'datetime.dayofweek']
*** End of fit() summary ***
/usr/local/lib/python3.10/dist-packages/autogluon/core/utils/plots.py:138: UserWarning: AutoGluon summary plots cannot be created because bokeh is not installed. To see plots, please do: "pip install bokeh==2.0.1"
warnings.warn('AutoGluon summary plots cannot be created because bokeh is not installed. To see plots, please do: "pip install bokeh==2.0.1"')
{'model_types': {'KNeighborsUnif_BAG_L1': 'StackerEnsembleModel_KNN',
'KNeighborsDist_BAG_L1': 'StackerEnsembleModel_KNN',
'LightGBMXT_BAG_L1': 'StackerEnsembleModel_LGB',
'LightGBM_BAG_L1': 'StackerEnsembleModel_LGB',
'RandomForestMSE_BAG_L1': 'StackerEnsembleModel_RF',
'CatBoost_BAG_L1': 'StackerEnsembleModel_CatBoost',
'ExtraTreesMSE_BAG_L1': 'StackerEnsembleModel_XT',
'NeuralNetFastAI_BAG_L1': 'StackerEnsembleModel_NNFastAiTabular',
'WeightedEnsemble_L2': 'WeightedEnsembleModel',
'LightGBMXT_BAG_L2': 'StackerEnsembleModel_LGB',
'LightGBM_BAG_L2': 'StackerEnsembleModel_LGB',
'RandomForestMSE_BAG_L2': 'StackerEnsembleModel_RF',
'ExtraTreesMSE_BAG_L2': 'StackerEnsembleModel_XT',
'WeightedEnsemble_L3': 'WeightedEnsembleModel'},
'model_performance': {'KNeighborsUnif_BAG_L1': -101.54619908446061,
'KNeighborsDist_BAG_L1': -84.12506123181602,
'LightGBMXT_BAG_L1': -131.46090891834504,
'LightGBM_BAG_L1': -131.054161598899,
'RandomForestMSE_BAG_L1': -116.54835939455667,
'CatBoost_BAG_L1': -136.52027987483837,
'ExtraTreesMSE_BAG_L1': -124.60067564699747,
'NeuralNetFastAI_BAG_L1': -150.327073054153,
'WeightedEnsemble_L2': -84.12506123181602,
'LightGBMXT_BAG_L2': -60.60970530923543,
'LightGBM_BAG_L2': -54.85723921824166,
'RandomForestMSE_BAG_L2': -53.345035675502345,
'ExtraTreesMSE_BAG_L2': -53.73673838605837,
'WeightedEnsemble_L3': -52.71182788176593},
'model_best': 'WeightedEnsemble_L3',
'model_paths': {'KNeighborsUnif_BAG_L1': '/content/drive/MyDrive/models/KNeighborsUnif_BAG_L1/',
'KNeighborsDist_BAG_L1': '/content/drive/MyDrive/models/KNeighborsDist_BAG_L1/',
'LightGBMXT_BAG_L1': '/content/drive/MyDrive/models/LightGBMXT_BAG_L1/',
'LightGBM_BAG_L1': '/content/drive/MyDrive/models/LightGBM_BAG_L1/',
'RandomForestMSE_BAG_L1': '/content/drive/MyDrive/models/RandomForestMSE_BAG_L1/',
'CatBoost_BAG_L1': '/content/drive/MyDrive/models/CatBoost_BAG_L1/',
'ExtraTreesMSE_BAG_L1': '/content/drive/MyDrive/models/ExtraTreesMSE_BAG_L1/',
'NeuralNetFastAI_BAG_L1': '/content/drive/MyDrive/models/NeuralNetFastAI_BAG_L1/',
'WeightedEnsemble_L2': '/content/drive/MyDrive/models/WeightedEnsemble_L2/',
'LightGBMXT_BAG_L2': '/content/drive/MyDrive/models/LightGBMXT_BAG_L2/',
'LightGBM_BAG_L2': '/content/drive/MyDrive/models/LightGBM_BAG_L2/',
'RandomForestMSE_BAG_L2': '/content/drive/MyDrive/models/RandomForestMSE_BAG_L2/',
'ExtraTreesMSE_BAG_L2': '/content/drive/MyDrive/models/ExtraTreesMSE_BAG_L2/',
'WeightedEnsemble_L3': '/content/drive/MyDrive/models/WeightedEnsemble_L3/'},
'model_fit_times': {'KNeighborsUnif_BAG_L1': 0.04454660415649414,
'KNeighborsDist_BAG_L1': 0.03139615058898926,
'LightGBMXT_BAG_L1': 105.21192502975464,
'LightGBM_BAG_L1': 40.332701683044434,
'RandomForestMSE_BAG_L1': 18.440346240997314,
'CatBoost_BAG_L1': 153.47969675064087,
'ExtraTreesMSE_BAG_L1': 4.775705099105835,
'NeuralNetFastAI_BAG_L1': 79.34784603118896,
'WeightedEnsemble_L2': 0.7487502098083496,
'LightGBMXT_BAG_L2': 73.41565442085266,
'LightGBM_BAG_L2': 36.75585961341858,
'RandomForestMSE_BAG_L2': 36.74685597419739,
'ExtraTreesMSE_BAG_L2': 12.678197145462036,
'WeightedEnsemble_L3': 0.23340487480163574},
'model_pred_times': {'KNeighborsUnif_BAG_L1': 0.04986453056335449,
'KNeighborsDist_BAG_L1': 0.03928112983703613,
'LightGBMXT_BAG_L1': 6.434841156005859,
'LightGBM_BAG_L1': 1.0815935134887695,
'RandomForestMSE_BAG_L1': 0.6320068836212158,
'CatBoost_BAG_L1': 0.039435625076293945,
'ExtraTreesMSE_BAG_L1': 0.6431436538696289,
'NeuralNetFastAI_BAG_L1': 0.2329256534576416,
'WeightedEnsemble_L2': 0.0009965896606445312,
'LightGBMXT_BAG_L2': 3.034464120864868,
'LightGBM_BAG_L2': 0.23101568222045898,
'RandomForestMSE_BAG_L2': 0.5887119770050049,
'ExtraTreesMSE_BAG_L2': 0.5510618686676025,
'WeightedEnsemble_L3': 0.0009133815765380859},
'num_bag_folds': 8,
'max_stack_level': 3,
'model_hyperparams': {'KNeighborsUnif_BAG_L1': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True,
'use_child_oof': True},
'KNeighborsDist_BAG_L1': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True,
'use_child_oof': True},
'LightGBMXT_BAG_L1': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True},
'LightGBM_BAG_L1': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True},
'RandomForestMSE_BAG_L1': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True,
'use_child_oof': True},
'CatBoost_BAG_L1': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True},
'ExtraTreesMSE_BAG_L1': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True,
'use_child_oof': True},
'NeuralNetFastAI_BAG_L1': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True},
'WeightedEnsemble_L2': {'use_orig_features': False,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True},
'LightGBMXT_BAG_L2': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True},
'LightGBM_BAG_L2': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True},
'RandomForestMSE_BAG_L2': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True,
'use_child_oof': True},
'ExtraTreesMSE_BAG_L2': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True,
'use_child_oof': True},
'WeightedEnsemble_L3': {'use_orig_features': False,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True}},
'leaderboard': model score_val pred_time_val fit_time \
0 WeightedEnsemble_L3 -52.711828 10.524795 488.078481
1 RandomForestMSE_BAG_L2 -53.345036 9.741804 438.411020
2 ExtraTreesMSE_BAG_L2 -53.736738 9.704154 414.342361
3 LightGBM_BAG_L2 -54.857239 9.384108 438.420023
4 LightGBMXT_BAG_L2 -60.609705 12.187556 475.079818
5 KNeighborsDist_BAG_L1 -84.125061 0.039281 0.031396
6 WeightedEnsemble_L2 -84.125061 0.040278 0.780146
7 KNeighborsUnif_BAG_L1 -101.546199 0.049865 0.044547
8 RandomForestMSE_BAG_L1 -116.548359 0.632007 18.440346
9 ExtraTreesMSE_BAG_L1 -124.600676 0.643144 4.775705
10 LightGBM_BAG_L1 -131.054162 1.081594 40.332702
11 LightGBMXT_BAG_L1 -131.460909 6.434841 105.211925
12 CatBoost_BAG_L1 -136.520280 0.039436 153.479697
13 NeuralNetFastAI_BAG_L1 -150.327073 0.232926 79.347846
pred_time_val_marginal fit_time_marginal stack_level can_infer \
0 0.000913 0.233405 3 True
1 0.588712 36.746856 2 True
2 0.551062 12.678197 2 True
3 0.231016 36.755860 2 True
4 3.034464 73.415654 2 True
5 0.039281 0.031396 1 True
6 0.000997 0.748750 2 True
7 0.049865 0.044547 1 True
8 0.632007 18.440346 1 True
9 0.643144 4.775705 1 True
10 1.081594 40.332702 1 True
11 6.434841 105.211925 1 True
12 0.039436 153.479697 1 True
13 0.232926 79.347846 1 True
fit_order
0 14
1 12
2 13
3 11
4 10
5 2
6 9
7 1
8 5
9 7
10 4
11 3
12 6
13 8 }
# Plot each model's `score_val` in a bar chart to compare performance
predictor.leaderboard(silent=True).plot(kind="bar", x="model", y="score_val")
<Axes: xlabel='model'>
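The bar chart is easier to read if the leaderboard is sorted by validation score first. A minimal sketch with a toy two-row leaderboard (column names as AutoGluon reports them; the values below are illustrative, not the full table):

```python
import pandas as pd

# Toy leaderboard with AutoGluon's column names; score_val is
# higher-is-better (negated RMSE), so sort descending for best-first
lb = pd.DataFrame({
    "model": ["WeightedEnsemble_L3", "KNeighborsUnif_BAG_L1"],
    "score_val": [-52.71, -101.55],
})
lb_sorted = lb.sort_values("score_val", ascending=False)
```

Passing `lb_sorted` to `.plot(kind="bar", x="model", y="score_val")` then shows the best model on the left.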
performance = predictor.evaluate(train)
/usr/local/lib/python3.10/dist-packages/autogluon/tabular/predictor/predictor.py:1420: FutureWarning: Calling `predictor.predict_proba` when problem_type=regression will raise an AssertionError starting in AutoGluon v0.8. Please call `predictor.predict` instead.
warnings.warn(
Evaluation: root_mean_squared_error on test data: -71.44761713042256
Note: Scores are always higher_is_better. This metric score can be multiplied by -1 to get the metric value.
Evaluations on test data:
{
"root_mean_squared_error": -71.44761713042256,
"mean_squared_error": -5104.7619936154515,
"mean_absolute_error": -48.78840069471421,
"r2": 0.8444158643200542,
"pearsonr": 0.9458780588572309,
"median_absolute_error": -29.097787857055664
}
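As a sanity check, the root_mean_squared_error that `evaluate` reports can be reproduced by hand from predictions and targets. A sketch with toy values (not the actual dataset):

```python
import numpy as np

# Toy targets and predictions, purely illustrative
y_true = np.array([16, 40, 32, 13, 1], dtype=float)
y_pred = np.array([23.6, 38.4, 43.0, 47.8, 51.2])

# RMSE = sqrt(mean of squared errors); AutoGluon reports its negation
# so that all metric scores are higher-is-better
rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
```

Note that evaluating on the training data, as above, gives an optimistic score; the Kaggle public score on unseen test data is the more honest measure.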
predictions = predictor.predict(test)
predictions.head()
0    23.611856
1    38.397758
2    42.997002
3    47.795578
4    51.163578
Name: count, dtype: float32
# Describe the `predictions` series to see if there are any negative values
predictions.describe()
count    6493.000000
mean      100.693581
std        90.041473
min         2.923446
25%        20.537388
50%        63.407089
75%       169.359085
max       365.016479
Name: count, dtype: float64
# Sum any negative predictions (0.0 means there are none)
predictions[predictions < 0].sum()
0.0
# Set any negative predictions to zero
predictions[predictions < 0] = 0
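An equivalent, arguably more idiomatic way to zero out negative predictions is pandas' `clip`. A sketch on a toy series (not the real prediction output):

```python
import pandas as pd

# Toy series standing in for model predictions
preds = pd.Series([-1.5, 0.0, 23.6, 42.9], name="count")

# (preds < 0).sum() counts negatives, unlike summing the filtered values
n_negative = int((preds < 0).sum())

# clip(lower=0) replaces every negative value with 0 in one vectorized call
preds = preds.clip(lower=0)
```

Kaggle rejects negative counts for this competition, so either approach is needed before submitting.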
submission["count"] = predictions
submission.to_csv("submission.csv", index=False)
!kaggle competitions submit -c bike-sharing-demand -f submission.csv -m "first raw submission"
100% 188k/188k [00:01<00:00, 107kB/s]
Successfully submitted to Bike Sharing Demand
My Submissions
!kaggle competitions submissions -c bike-sharing-demand | tail -n +1 | head -n 6
fileName                     date                 description                        status    publicScore  privateScore
---------------------------  -------------------  ---------------------------------  --------  -----------  ------------
submission.csv               2023-05-26 15:53:07  first raw submission               complete  1.80414      1.80414
submission_new_features.csv  2023-05-26 15:25:32  new features                       complete  0.71055      0.71055
submission.csv               2023-05-26 15:13:41  first raw submission               complete  1.79631      1.79631
submission_new_hpo.csv       2023-05-26 14:22:40  new features with hyperparameters  complete  0.46463      0.46463
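The public scores above can be collected into a small DataFrame to compare iterations programmatically (this competition is scored with RMSLE, so lower is better). A sketch, with iteration labels of our own choosing and scores copied from the table:

```python
import pandas as pd

# Public scores from the Kaggle submissions table above
scores = pd.DataFrame({
    "iteration": ["initial", "add_features", "hpo"],
    "public_score": [1.80414, 0.71055, 0.46463],
})

# Lower RMSLE is better, so idxmin picks the best iteration
best = scores.loc[scores["public_score"].idxmin(), "iteration"]
```

This kind of table is also what the project writeup asks for when charting score improvement across model iterations.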
# Create a histogram of each feature to show its distribution. This is part of the exploratory data analysis
train.hist(figsize=(25,25))
array([[<Axes: title={'center': 'datetime'}>,
<Axes: title={'center': 'season'}>,
<Axes: title={'center': 'holiday'}>],
[<Axes: title={'center': 'workingday'}>,
<Axes: title={'center': 'weather'}>,
<Axes: title={'center': 'temp'}>],
[<Axes: title={'center': 'atemp'}>,
<Axes: title={'center': 'humidity'}>,
<Axes: title={'center': 'windspeed'}>],
[<Axes: title={'center': 'casual'}>,
<Axes: title={'center': 'registered'}>,
<Axes: title={'center': 'count'}>]], dtype=object)
!pip install pandas-profiling
Looking in indexes: https://pypi.org/simple, https://us-python.pkg.dev/colab-wheels/public/simple/
Collecting pandas-profiling
Downloading pandas_profiling-3.6.6-py2.py3-none-any.whl (324 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 324.4/324.4 kB 23.5 MB/s eta 0:00:00
Collecting ydata-profiling (from pandas-profiling)
Downloading ydata_profiling-4.2.0-py2.py3-none-any.whl (352 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 352.3/352.3 kB 29.9 MB/s eta 0:00:00
Requirement already satisfied: scipy<1.11,>=1.4.1 in /usr/local/lib/python3.10/dist-packages (from ydata-profiling->pandas-profiling) (1.10.1)
Requirement already satisfied: pandas!=1.4.0,<2,>1.1 in /usr/local/lib/python3.10/dist-packages (from ydata-profiling->pandas-profiling) (1.5.3)
Requirement already satisfied: matplotlib<4,>=3.2 in /usr/local/lib/python3.10/dist-packages (from ydata-profiling->pandas-profiling) (3.7.1)
Requirement already satisfied: pydantic<2,>=1.8.1 in /usr/local/lib/python3.10/dist-packages (from ydata-profiling->pandas-profiling) (1.10.7)
Requirement already satisfied: PyYAML<6.1,>=5.0.0 in /usr/local/lib/python3.10/dist-packages (from ydata-profiling->pandas-profiling) (6.0)
Requirement already satisfied: jinja2<3.2,>=2.11.1 in /usr/local/lib/python3.10/dist-packages (from ydata-profiling->pandas-profiling) (3.1.2)
Collecting visions[type_image_path]==0.7.5 (from ydata-profiling->pandas-profiling)
Downloading visions-0.7.5-py3-none-any.whl (102 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 102.7/102.7 kB 14.4 MB/s eta 0:00:00
Requirement already satisfied: numpy<1.24,>=1.16.0 in /usr/local/lib/python3.10/dist-packages (from ydata-profiling->pandas-profiling) (1.22.4)
Collecting htmlmin==0.1.12 (from ydata-profiling->pandas-profiling)
Downloading htmlmin-0.1.12.tar.gz (19 kB)
Preparing metadata (setup.py) ... done
Collecting phik<0.13,>=0.11.1 (from ydata-profiling->pandas-profiling)
Downloading phik-0.12.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (679 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 679.5/679.5 kB 55.7 MB/s eta 0:00:00
Requirement already satisfied: requests<3,>=2.24.0 in /usr/local/lib/python3.10/dist-packages (from ydata-profiling->pandas-profiling) (2.27.1)
Requirement already satisfied: tqdm<5,>=4.48.2 in /usr/local/lib/python3.10/dist-packages (from ydata-profiling->pandas-profiling) (4.65.0)
Requirement already satisfied: seaborn<0.13,>=0.10.1 in /usr/local/lib/python3.10/dist-packages (from ydata-profiling->pandas-profiling) (0.12.2)
Collecting multimethod<2,>=1.4 (from ydata-profiling->pandas-profiling)
Downloading multimethod-1.9.1-py3-none-any.whl (10 kB)
Requirement already satisfied: statsmodels<1,>=0.13.2 in /usr/local/lib/python3.10/dist-packages (from ydata-profiling->pandas-profiling) (0.13.5)
Collecting typeguard<3,>=2.13.2 (from ydata-profiling->pandas-profiling)
Downloading typeguard-2.13.3-py3-none-any.whl (17 kB)
Collecting imagehash==4.3.1 (from ydata-profiling->pandas-profiling)
Downloading ImageHash-4.3.1-py2.py3-none-any.whl (296 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 296.5/296.5 kB 35.1 MB/s eta 0:00:00
Collecting wordcloud>=1.9.1 (from ydata-profiling->pandas-profiling)
Downloading wordcloud-1.9.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (455 kB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 455.4/455.4 kB 48.1 MB/s eta 0:00:00
Collecting dacite>=1.8 (from ydata-profiling->pandas-profiling)
Downloading dacite-1.8.1-py3-none-any.whl (14 kB)
Requirement already satisfied: PyWavelets in /usr/local/lib/python3.10/dist-packages (from imagehash==4.3.1->ydata-profiling->pandas-profiling) (1.4.1)
Requirement already satisfied: pillow in /usr/local/lib/python3.10/dist-packages (from imagehash==4.3.1->ydata-profiling->pandas-profiling) (8.4.0)
Requirement already satisfied: attrs>=19.3.0 in /usr/local/lib/python3.10/dist-packages (from visions[type_image_path]==0.7.5->ydata-profiling->pandas-profiling) (23.1.0)
Requirement already satisfied: networkx>=2.4 in /usr/local/lib/python3.10/dist-packages (from visions[type_image_path]==0.7.5->ydata-profiling->pandas-profiling) (3.1)
Collecting tangled-up-in-unicode>=0.0.4 (from visions[type_image_path]==0.7.5->ydata-profiling->pandas-profiling)
Downloading tangled_up_in_unicode-0.2.0-py3-none-any.whl (4.7 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.7/4.7 MB 89.1 MB/s eta 0:00:00
Requirement already satisfied: MarkupSafe>=2.0 in /usr/local/lib/python3.10/dist-packages (from jinja2<3.2,>=2.11.1->ydata-profiling->pandas-profiling) (2.1.2)
Requirement already satisfied: contourpy>=1.0.1 in /usr/local/lib/python3.10/dist-packages (from matplotlib<4,>=3.2->ydata-profiling->pandas-profiling) (1.0.7)
Requirement already satisfied: cycler>=0.10 in /usr/local/lib/python3.10/dist-packages (from matplotlib<4,>=3.2->ydata-profiling->pandas-profiling) (0.11.0)
Requirement already satisfied: fonttools>=4.22.0 in /usr/local/lib/python3.10/dist-packages (from matplotlib<4,>=3.2->ydata-profiling->pandas-profiling) (4.39.3)
Requirement already satisfied: kiwisolver>=1.0.1 in /usr/local/lib/python3.10/dist-packages (from matplotlib<4,>=3.2->ydata-profiling->pandas-profiling) (1.4.4)
Requirement already satisfied: packaging>=20.0 in /usr/local/lib/python3.10/dist-packages (from matplotlib<4,>=3.2->ydata-profiling->pandas-profiling) (23.1)
Requirement already satisfied: pyparsing>=2.3.1 in /usr/local/lib/python3.10/dist-packages (from matplotlib<4,>=3.2->ydata-profiling->pandas-profiling) (3.0.9)
Requirement already satisfied: python-dateutil>=2.7 in /usr/local/lib/python3.10/dist-packages (from matplotlib<4,>=3.2->ydata-profiling->pandas-profiling) (2.8.2)
Requirement already satisfied: pytz>=2020.1 in /usr/local/lib/python3.10/dist-packages (from pandas!=1.4.0,<2,>1.1->ydata-profiling->pandas-profiling) (2022.7.1)
Requirement already satisfied: joblib>=0.14.1 in /usr/local/lib/python3.10/dist-packages (from phik<0.13,>=0.11.1->ydata-profiling->pandas-profiling) (1.2.0)
Requirement already satisfied: typing-extensions>=4.2.0 in /usr/local/lib/python3.10/dist-packages (from pydantic<2,>=1.8.1->ydata-profiling->pandas-profiling) (4.5.0)
Requirement already satisfied: urllib3<1.27,>=1.21.1 in /usr/local/lib/python3.10/dist-packages (from requests<3,>=2.24.0->ydata-profiling->pandas-profiling) (1.26.15)
Requirement already satisfied: certifi>=2017.4.17 in /usr/local/lib/python3.10/dist-packages (from requests<3,>=2.24.0->ydata-profiling->pandas-profiling) (2022.12.7)
Requirement already satisfied: charset-normalizer~=2.0.0 in /usr/local/lib/python3.10/dist-packages (from requests<3,>=2.24.0->ydata-profiling->pandas-profiling) (2.0.12)
Requirement already satisfied: idna<4,>=2.5 in /usr/local/lib/python3.10/dist-packages (from requests<3,>=2.24.0->ydata-profiling->pandas-profiling) (3.4)
Requirement already satisfied: patsy>=0.5.2 in /usr/local/lib/python3.10/dist-packages (from statsmodels<1,>=0.13.2->ydata-profiling->pandas-profiling) (0.5.3)
Requirement already satisfied: six in /usr/local/lib/python3.10/dist-packages (from patsy>=0.5.2->statsmodels<1,>=0.13.2->ydata-profiling->pandas-profiling) (1.16.0)
Building wheels for collected packages: htmlmin
Building wheel for htmlmin (setup.py) ... done
Created wheel for htmlmin: filename=htmlmin-0.1.12-py3-none-any.whl size=27081 sha256=1c5a19b726401d2b2557964d1f33bfcecb3a043326f9551a2df45c2aae216487
Stored in directory: /root/.cache/pip/wheels/dd/91/29/a79cecb328d01739e64017b6fb9a1ab9d8cb1853098ec5966d
Successfully built htmlmin
Installing collected packages: htmlmin, typeguard, tangled-up-in-unicode, multimethod, dacite, imagehash, wordcloud, visions, phik, ydata-profiling, pandas-profiling
Attempting uninstall: wordcloud
Found existing installation: wordcloud 1.8.2.2
Uninstalling wordcloud-1.8.2.2:
Successfully uninstalled wordcloud-1.8.2.2
Successfully installed dacite-1.8.1 htmlmin-0.1.12 imagehash-4.3.1 multimethod-1.9.1 pandas-profiling-3.6.6 phik-0.12.3 tangled-up-in-unicode-0.2.0 typeguard-2.13.3 visions-0.7.5 wordcloud-1.9.2 ydata-profiling-4.2.0
import ydata_profiling as pp
profile = pp.ProfileReport(train)
profile
# Create new features
train['hour'] = train['datetime'].dt.hour
test['hour'] = test['datetime'].dt.hour
train['hour_squared'] = train['hour'] ** 2
train['temp_humidity'] = train['temp'] * train['humidity']
test['hour_squared'] = test['hour'] ** 2
test['temp_humidity'] = test['temp'] * test['humidity']
train["season"] = train["season"].astype(dtype='category')
train["weather"] = train["weather"].astype(dtype='category')
test["season"] = test["season"].astype(dtype='category')
test["weather"] = test["weather"].astype(dtype='category')
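Since every transformation above must be applied identically to both `train` and `test`, it can help to wrap them in one helper so the two frames never drift apart. A sketch (the function name `add_features` is our own; the demo frame is a one-row stand-in, not the real data):

```python
import pandas as pd

def add_features(df: pd.DataFrame) -> pd.DataFrame:
    """Apply the same feature engineering to any frame with the raw columns."""
    df = df.copy()
    df["hour"] = df["datetime"].dt.hour
    df["hour_squared"] = df["hour"] ** 2
    df["temp_humidity"] = df["temp"] * df["humidity"]
    df["season"] = df["season"].astype("category")
    df["weather"] = df["weather"].astype("category")
    return df

# Toy one-row frame mimicking the raw schema
demo = pd.DataFrame({
    "datetime": pd.to_datetime(["2011-01-01 05:00:00"]),
    "temp": [9.84], "humidity": [75], "season": [1], "weather": [1],
})
demo = add_features(demo)
```

With this helper, `train = add_features(train)` and `test = add_features(test)` replace the ten lines above.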
# View our new features
train.head()
| datetime | season | holiday | workingday | weather | temp | atemp | humidity | windspeed | casual | registered | count | hour | hour_squared | temp_humidity | |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0 | 2011-01-01 00:00:00 | 1 | 0 | 0 | 1 | 9.84 | 14.395 | 81 | 0.0 | 3 | 13 | 16 | 0 | 0 | 797.04 |
| 1 | 2011-01-01 01:00:00 | 1 | 0 | 0 | 1 | 9.02 | 13.635 | 80 | 0.0 | 8 | 32 | 40 | 1 | 1 | 721.60 |
| 2 | 2011-01-01 02:00:00 | 1 | 0 | 0 | 1 | 9.02 | 13.635 | 80 | 0.0 | 5 | 27 | 32 | 2 | 4 | 721.60 |
| 3 | 2011-01-01 03:00:00 | 1 | 0 | 0 | 1 | 9.84 | 14.395 | 75 | 0.0 | 3 | 10 | 13 | 3 | 9 | 738.00 |
| 4 | 2011-01-01 04:00:00 | 1 | 0 | 0 | 1 | 9.84 | 14.395 | 75 | 0.0 | 0 | 1 | 1 | 4 | 16 | 738.00 |
# View histogram of all features again now with the hour feature
train.hist(figsize=(25,25))
array([[<Axes: title={'center': 'datetime'}>,
<Axes: title={'center': 'holiday'}>,
<Axes: title={'center': 'workingday'}>,
<Axes: title={'center': 'temp'}>],
[<Axes: title={'center': 'atemp'}>,
<Axes: title={'center': 'humidity'}>,
<Axes: title={'center': 'windspeed'}>,
<Axes: title={'center': 'casual'}>],
[<Axes: title={'center': 'registered'}>,
<Axes: title={'center': 'count'}>,
<Axes: title={'center': 'hour'}>,
<Axes: title={'center': 'hour_squared'}>],
[<Axes: title={'center': 'temp_humidity'}>, <Axes: >, <Axes: >,
<Axes: >]], dtype=object)
profile_added_feature = pp.ProfileReport(train)
profile_added_feature
predictor_new_features = TabularPredictor(label='count',
problem_type='regression',
path='/content/drive/MyDrive',
eval_metric='root_mean_squared_error').fit(train_data = train.drop(['casual', 'registered'],axis=1),
time_limit=600,
presets='best_quality',
ag_args_fit={'num_gpus': 1}
)
Warning: path already exists! This predictor may overwrite an existing predictor! path="/content/drive/MyDrive"
Presets specified: ['best_quality']
Stack configuration (auto_stack=True): num_stack_levels=1, num_bag_folds=8, num_bag_sets=20
Beginning AutoGluon training ... Time limit = 600s
AutoGluon will save models to "/content/drive/MyDrive/"
AutoGluon Version: 0.7.0
Python Version: 3.10.11
Operating System: Linux
Platform Machine: x86_64
Platform Version: #1 SMP Sat Apr 29 09:15:28 UTC 2023
Train Data Rows: 10886
Train Data Columns: 12
Label Column: count
Preprocessing data ...
Using Feature Generators to preprocess the data ...
Fitting AutoMLPipelineFeatureGenerator...
Available Memory: 11265.07 MB
Train Data (Original) Memory Usage: 0.89 MB (0.0% of available memory)
Inferring data type of each feature based on column values. Set feature_metadata_in to manually specify special dtypes of the features.
Stage 1 Generators:
Fitting AsTypeFeatureGenerator...
Note: Converting 2 features to boolean dtype as they only contain 2 unique values.
Stage 2 Generators:
Fitting FillNaFeatureGenerator...
Stage 3 Generators:
Fitting IdentityFeatureGenerator...
Fitting CategoryFeatureGenerator...
Fitting CategoryMemoryMinimizeFeatureGenerator...
Fitting DatetimeFeatureGenerator...
Stage 4 Generators:
Fitting DropUniqueFeatureGenerator...
Types of features in original data (raw dtype, special dtypes):
('category', []) : 2 | ['season', 'weather']
('datetime', []) : 1 | ['datetime']
('float', []) : 4 | ['temp', 'atemp', 'windspeed', 'temp_humidity']
('int', []) : 5 | ['holiday', 'workingday', 'humidity', 'hour', 'hour_squared']
Types of features in processed data (raw dtype, special dtypes):
('category', []) : 2 | ['season', 'weather']
('float', []) : 4 | ['temp', 'atemp', 'windspeed', 'temp_humidity']
('int', []) : 3 | ['humidity', 'hour', 'hour_squared']
('int', ['bool']) : 2 | ['holiday', 'workingday']
('int', ['datetime_as_int']) : 5 | ['datetime', 'datetime.year', 'datetime.month', 'datetime.day', 'datetime.dayofweek']
0.2s = Fit runtime
12 features in original data used to generate 16 features in processed data.
Train Data (Processed) Memory Usage: 1.09 MB (0.0% of available memory)
Data preprocessing and feature engineering runtime = 0.21s ...
AutoGluon will gauge predictive performance using evaluation metric: 'root_mean_squared_error'
This metric's sign has been flipped to adhere to being higher_is_better. The metric score can be multiplied by -1 to get the metric value.
To change this, specify the eval_metric parameter of Predictor()
AutoGluon will fit 2 stack levels (L1 to L2) ...
Fitting 11 L1 models ...
Fitting model: KNeighborsUnif_BAG_L1 ... Training model for up to 399.76s of the 599.79s of remaining time.
-101.5462 = Validation score (-root_mean_squared_error)
0.05s = Training runtime
0.07s = Validation runtime
Fitting model: KNeighborsDist_BAG_L1 ... Training model for up to 399.58s of the 599.6s of remaining time.
-84.1251 = Validation score (-root_mean_squared_error)
0.05s = Training runtime
0.06s = Validation runtime
Fitting model: LightGBMXT_BAG_L1 ... Training model for up to 399.4s of the 599.43s of remaining time.
Fitting 8 child models (S1F1 - S1F8) | Fitting with ParallelLocalFoldFittingStrategy
-34.1655 = Validation score (-root_mean_squared_error)
125.55s = Training runtime
8.74s = Validation runtime
Fitting model: LightGBM_BAG_L1 ... Training model for up to 268.25s of the 468.28s of remaining time.
Fitting 8 child models (S1F1 - S1F8) | Fitting with ParallelLocalFoldFittingStrategy
-34.2506 = Validation score (-root_mean_squared_error)
65.46s = Training runtime
2.81s = Validation runtime
Fitting model: RandomForestMSE_BAG_L1 ... Training model for up to 198.88s of the 398.9s of remaining time.
-38.537 = Validation score (-root_mean_squared_error)
23.32s = Training runtime
0.73s = Validation runtime
Fitting model: CatBoost_BAG_L1 ... Training model for up to 173.4s of the 373.43s of remaining time.
Fitting 8 child models (S1F1 - S1F8) | Fitting with ParallelLocalFoldFittingStrategy
-39.3065 = Validation score (-root_mean_squared_error)
238.73s = Training runtime
0.08s = Validation runtime
Completed 1/20 k-fold bagging repeats ...
Fitting model: WeightedEnsemble_L2 ... Training model for up to 360.0s of the 131.25s of remaining time.
-32.3686 = Validation score (-root_mean_squared_error)
0.42s = Training runtime
0.0s = Validation runtime
Fitting 9 L2 models ...
Fitting model: LightGBMXT_BAG_L2 ... Training model for up to 130.77s of the 130.75s of remaining time.
Fitting 8 child models (S1F1 - S1F8) | Fitting with ParallelLocalFoldFittingStrategy
-30.7925 = Validation score (-root_mean_squared_error)
50.54s = Training runtime
1.28s = Validation runtime
Fitting model: LightGBM_BAG_L2 ... Training model for up to 77.55s of the 77.53s of remaining time.
Fitting 8 child models (S1F1 - S1F8) | Fitting with ParallelLocalFoldFittingStrategy
-30.804 = Validation score (-root_mean_squared_error)
39.7s = Training runtime
0.24s = Validation runtime
Fitting model: RandomForestMSE_BAG_L2 ... Training model for up to 35.5s of the 35.48s of remaining time.
-31.9367 = Validation score (-root_mean_squared_error)
35.37s = Training runtime
0.77s = Validation runtime
Completed 1/20 k-fold bagging repeats ...
Fitting model: WeightedEnsemble_L3 ... Training model for up to 360.0s of the -2.04s of remaining time.
-30.358 = Validation score (-root_mean_squared_error)
0.27s = Training runtime
0.0s = Validation runtime
AutoGluon training complete, total runtime = 602.53s ... Best model: "WeightedEnsemble_L3"
TabularPredictor saved. To load, use: predictor = TabularPredictor.load("/content/drive/MyDrive/")
predictor_new_features.fit_summary()
*** Summary of fit() ***
Estimated performance of each model:
model score_val pred_time_val fit_time pred_time_val_marginal fit_time_marginal stack_level can_infer fit_order
0 WeightedEnsemble_L3 -30.357975 14.772519 579.032353 0.000739 0.266783 3 True 11
1 LightGBMXT_BAG_L2 -30.792525 13.762315 503.695594 1.278502 50.536785 2 True 8
2 LightGBM_BAG_L2 -30.804049 12.724764 492.855033 0.240951 39.696224 2 True 9
3 RandomForestMSE_BAG_L2 -31.936662 13.252328 488.532561 0.768514 35.373752 2 True 10
4 WeightedEnsemble_L2 -32.368611 12.341320 214.797323 0.000899 0.420642 2 True 7
5 LightGBMXT_BAG_L1 -34.165530 8.738765 125.547781 8.738765 125.547781 1 True 3
6 LightGBM_BAG_L1 -34.250596 2.810006 65.464654 2.810006 65.464654 1 True 4
7 RandomForestMSE_BAG_L1 -38.537041 0.731851 23.315383 0.731851 23.315383 1 True 5
8 CatBoost_BAG_L1 -39.306461 0.076909 238.730169 0.076909 238.730169 1 True 6
9 KNeighborsDist_BAG_L1 -84.125061 0.059798 0.048863 0.059798 0.048863 1 True 2
10 KNeighborsUnif_BAG_L1 -101.546199 0.066484 0.051959 0.066484 0.051959 1 True 1
Number of models trained: 11
Types of models trained:
{'StackerEnsembleModel_LGB', 'StackerEnsembleModel_RF', 'StackerEnsembleModel_KNN', 'WeightedEnsembleModel', 'StackerEnsembleModel_CatBoost'}
Bagging used: True (with 8 folds)
Multi-layer stack-ensembling used: True (with 3 levels)
Feature Metadata (Processed):
(raw dtype, special dtypes):
('category', []) : 2 | ['season', 'weather']
('float', []) : 4 | ['temp', 'atemp', 'windspeed', 'temp_humidity']
('int', []) : 3 | ['humidity', 'hour', 'hour_squared']
('int', ['bool']) : 2 | ['holiday', 'workingday']
('int', ['datetime_as_int']) : 5 | ['datetime', 'datetime.year', 'datetime.month', 'datetime.day', 'datetime.dayofweek']
*** End of fit() summary ***
/usr/local/lib/python3.10/dist-packages/autogluon/core/utils/plots.py:138: UserWarning: AutoGluon summary plots cannot be created because bokeh is not installed. To see plots, please do: "pip install bokeh==2.0.1"
warnings.warn('AutoGluon summary plots cannot be created because bokeh is not installed. To see plots, please do: "pip install bokeh==2.0.1"')
{'model_types': {'KNeighborsUnif_BAG_L1': 'StackerEnsembleModel_KNN',
'KNeighborsDist_BAG_L1': 'StackerEnsembleModel_KNN',
'LightGBMXT_BAG_L1': 'StackerEnsembleModel_LGB',
'LightGBM_BAG_L1': 'StackerEnsembleModel_LGB',
'RandomForestMSE_BAG_L1': 'StackerEnsembleModel_RF',
'CatBoost_BAG_L1': 'StackerEnsembleModel_CatBoost',
'WeightedEnsemble_L2': 'WeightedEnsembleModel',
'LightGBMXT_BAG_L2': 'StackerEnsembleModel_LGB',
'LightGBM_BAG_L2': 'StackerEnsembleModel_LGB',
'RandomForestMSE_BAG_L2': 'StackerEnsembleModel_RF',
'WeightedEnsemble_L3': 'WeightedEnsembleModel'},
'model_performance': {'KNeighborsUnif_BAG_L1': -101.54619908446061,
'KNeighborsDist_BAG_L1': -84.12506123181602,
'LightGBMXT_BAG_L1': -34.16553046530689,
'LightGBM_BAG_L1': -34.250596295334894,
'RandomForestMSE_BAG_L1': -38.53704066266535,
'CatBoost_BAG_L1': -39.30646117897524,
'WeightedEnsemble_L2': -32.368611096612945,
'LightGBMXT_BAG_L2': -30.792524709870932,
'LightGBM_BAG_L2': -30.804049267579725,
'RandomForestMSE_BAG_L2': -31.936661977823114,
'WeightedEnsemble_L3': -30.357974839434828},
'model_best': 'WeightedEnsemble_L3',
'model_paths': {'KNeighborsUnif_BAG_L1': '/content/drive/MyDrive/models/KNeighborsUnif_BAG_L1/',
'KNeighborsDist_BAG_L1': '/content/drive/MyDrive/models/KNeighborsDist_BAG_L1/',
'LightGBMXT_BAG_L1': '/content/drive/MyDrive/models/LightGBMXT_BAG_L1/',
'LightGBM_BAG_L1': '/content/drive/MyDrive/models/LightGBM_BAG_L1/',
'RandomForestMSE_BAG_L1': '/content/drive/MyDrive/models/RandomForestMSE_BAG_L1/',
'CatBoost_BAG_L1': '/content/drive/MyDrive/models/CatBoost_BAG_L1/',
'WeightedEnsemble_L2': '/content/drive/MyDrive/models/WeightedEnsemble_L2/',
'LightGBMXT_BAG_L2': '/content/drive/MyDrive/models/LightGBMXT_BAG_L2/',
'LightGBM_BAG_L2': '/content/drive/MyDrive/models/LightGBM_BAG_L2/',
'RandomForestMSE_BAG_L2': '/content/drive/MyDrive/models/RandomForestMSE_BAG_L2/',
'WeightedEnsemble_L3': '/content/drive/MyDrive/models/WeightedEnsemble_L3/'},
'model_fit_times': {'KNeighborsUnif_BAG_L1': 0.051958560943603516,
'KNeighborsDist_BAG_L1': 0.04886293411254883,
'LightGBMXT_BAG_L1': 125.547780752182,
'LightGBM_BAG_L1': 65.4646544456482,
'RandomForestMSE_BAG_L1': 23.315383434295654,
'CatBoost_BAG_L1': 238.7301688194275,
'WeightedEnsemble_L2': 0.4206418991088867,
'LightGBMXT_BAG_L2': 50.53678488731384,
'LightGBM_BAG_L2': 39.69622445106506,
'RandomForestMSE_BAG_L2': 35.373751640319824,
'WeightedEnsemble_L3': 0.2667829990386963},
'model_pred_times': {'KNeighborsUnif_BAG_L1': 0.06648397445678711,
'KNeighborsDist_BAG_L1': 0.059798479080200195,
'LightGBMXT_BAG_L1': 8.738764762878418,
'LightGBM_BAG_L1': 2.8100063800811768,
'RandomForestMSE_BAG_L1': 0.7318508625030518,
'CatBoost_BAG_L1': 0.07690882682800293,
'WeightedEnsemble_L2': 0.0008993148803710938,
'LightGBMXT_BAG_L2': 1.2785015106201172,
'LightGBM_BAG_L2': 0.2409510612487793,
'RandomForestMSE_BAG_L2': 0.7685143947601318,
'WeightedEnsemble_L3': 0.0007390975952148438},
'num_bag_folds': 8,
'max_stack_level': 3,
'model_hyperparams': {'KNeighborsUnif_BAG_L1': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True,
'use_child_oof': True},
'KNeighborsDist_BAG_L1': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True,
'use_child_oof': True},
'LightGBMXT_BAG_L1': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True},
'LightGBM_BAG_L1': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True},
'RandomForestMSE_BAG_L1': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True,
'use_child_oof': True},
'CatBoost_BAG_L1': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True},
'WeightedEnsemble_L2': {'use_orig_features': False,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True},
'LightGBMXT_BAG_L2': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True},
'LightGBM_BAG_L2': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True},
'RandomForestMSE_BAG_L2': {'use_orig_features': True,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True,
'use_child_oof': True},
'WeightedEnsemble_L3': {'use_orig_features': False,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True}},
'leaderboard':                     model    score_val  pred_time_val    fit_time  pred_time_val_marginal  fit_time_marginal  stack_level  can_infer  fit_order
 0      WeightedEnsemble_L3   -30.357975      14.772519  579.032353                0.000739           0.266783            3       True         11
 1        LightGBMXT_BAG_L2   -30.792525      13.762315  503.695594                1.278502          50.536785            2       True          8
 2          LightGBM_BAG_L2   -30.804049      12.724764  492.855033                0.240951          39.696224            2       True          9
 3   RandomForestMSE_BAG_L2   -31.936662      13.252328  488.532561                0.768514          35.373752            2       True         10
 4      WeightedEnsemble_L2   -32.368611      12.341320  214.797323                0.000899           0.420642            2       True          7
 5        LightGBMXT_BAG_L1   -34.165530       8.738765  125.547781                8.738765         125.547781            1       True          3
 6          LightGBM_BAG_L1   -34.250596       2.810006   65.464654                2.810006          65.464654            1       True          4
 7   RandomForestMSE_BAG_L1   -38.537041       0.731851   23.315383                0.731851          23.315383            1       True          5
 8          CatBoost_BAG_L1   -39.306461       0.076909  238.730169                0.076909         238.730169            1       True          6
 9    KNeighborsDist_BAG_L1   -84.125061       0.059798    0.048863                0.059798           0.048863            1       True          2
 10   KNeighborsUnif_BAG_L1  -101.546199       0.066484    0.051959                0.066484           0.051959            1       True          1  }
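The `fit_summary` output above can also be inspected programmatically rather than read by eye. A minimal sketch, using a hand-copied subset of the `model_fit_times` values printed above, that totals training time and finds the most expensive model:

```python
# A hand-copied subset of the 'model_fit_times' values from fit_summary above.
model_fit_times = {
    "LightGBMXT_BAG_L1": 125.547780752182,
    "LightGBM_BAG_L1": 65.4646544456482,
    "RandomForestMSE_BAG_L1": 23.315383434295654,
    "CatBoost_BAG_L1": 238.7301688194275,
}

# Total wall-clock seconds spent fitting these base models.
total_fit_time = sum(model_fit_times.values())

# The single most expensive model to train.
slowest = max(model_fit_times, key=model_fit_times.get)

print(f"total: {total_fit_time:.1f}s, slowest: {slowest}")
```

Note that CatBoost dominates the fit time here despite having the fastest prediction time in `model_pred_times`.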
predictor_new_features.leaderboard(silent=True).plot(kind="bar", x="model", y="score_val")
<Axes: xlabel='model'>
predictions = predictor_new_features.predict(test)
# Kaggle rejects negative counts, so remember to set all negative values to zero
predictions[predictions < 0].sum()  # inspect the total negative mass before clipping
predictions[predictions < 0] = 0
# Submit these predictions the same way as before
submission_new_features = submission.copy()
submission_new_features["count"] = predictions
submission_new_features.to_csv("submission_new_features.csv", index=False)
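The clipping step can be checked in isolation. A minimal sketch with a toy pandas Series (the values are illustrative, not taken from the real predictions), showing that the boolean-mask assignment used above is equivalent to `Series.clip(lower=0)`:

```python
import pandas as pd

# Toy predictions containing a few negative values (illustrative only).
preds = pd.Series([12.3, -0.5, 104.0, -3.2, 0.0])

# Boolean-mask assignment, as in the notebook cell above ...
clipped_mask = preds.copy()
clipped_mask[clipped_mask < 0] = 0

# ... is equivalent to Series.clip with a lower bound of zero.
clipped = preds.clip(lower=0)

print(clipped.tolist())  # → [12.3, 0.0, 104.0, 0.0, 0.0]
```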
!kaggle competitions submit -c bike-sharing-demand -f submission_new_features.csv -m "new features"
100% 188k/188k [00:02<00:00, 71.4kB/s]
Successfully submitted to Bike Sharing Demand
!kaggle competitions submissions -c bike-sharing-demand | tail -n +1 | head -n 6
fileName                     date                 description           status    publicScore  privateScore
---------------------------  -------------------  --------------------  --------  -----------  ------------
submission_new_features.csv  2023-05-26 16:04:20  new features          complete  0.65044      0.65044
submission.csv               2023-05-26 15:53:07  first raw submission  complete  1.80414      1.80414
submission_new_features.csv  2023-05-26 15:25:32  new features          complete  0.71055      0.71055
submission.csv               2023-05-26 15:13:41  first raw submission  complete  1.79631      1.79631
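The `publicScore` column reports the competition metric, Root Mean Squared Logarithmic Error (RMSLE). A minimal pure-Python sketch of the metric (the sample values below are illustrative):

```python
import math

def rmsle(predicted, actual):
    """Root Mean Squared Logarithmic Error, the metric used by the
    Kaggle Bike Sharing Demand competition. Inputs must be non-negative,
    which is why negative predictions are clipped to zero before submitting."""
    squared_log_errors = [
        (math.log1p(p) - math.log1p(a)) ** 2
        for p, a in zip(predicted, actual)
    ]
    return math.sqrt(sum(squared_log_errors) / len(squared_log_errors))

# Perfect predictions score 0; larger errors raise the score.
print(rmsle([10, 100, 250], [10, 100, 250]))  # → 0.0
```

Because the error is taken in log space, RMSLE penalizes relative error: predicting 20 when the true count is 10 costs about as much as predicting 200 when the truth is 100.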
Hyperparameter tuning is configured through the hyperparameters and hyperparameter_tune_kwargs arguments.
import autogluon.core as ag
nn_options = {
'num_epochs': 10,
'learning_rate': ag.space.Real(1e-4, 1e-2, default=5e-4, log=True),
'activation': ag.space.Categorical('relu', 'softrelu', 'tanh'),
'layers': ag.space.Categorical([100], [500], [200, 100], [300, 200, 100]),
'dropout_prob': ag.space.Real(0.0, 0.5, default=0.1),
'batch_size': ag.space.Categorical(16, 32, 64)
}
gbm_options = {
'num_boost_round': 100,
'num_leaves': ag.space.Int(lower=26, upper=66, default=36),
'learning_rate': ag.space.Real(0.01, 0.2, default=0.1, log=True),
'min_data_in_leaf': ag.space.Int(lower=10, upper=100, default=20)
}
rf_options = {
'n_estimators': ag.space.Int(lower=100, upper=1000, default=200),
'max_depth': ag.space.Int(lower=5, upper=20, default=10),
'min_samples_leaf': ag.space.Int(lower=1, upper=10, default=1),
'max_features': ag.space.Categorical('sqrt', 'log2')
}
xgb_options = {
'n_estimators': ag.space.Int(lower=100, upper=1000, default=200),
'max_depth': ag.space.Int(lower=5, upper=20, default=10),
'learning_rate': ag.space.Real(0.01, 0.2, default=0.1, log=True),
'min_child_weight': ag.space.Real(1, 10, default=1),
'subsample': ag.space.Real(0.5, 1, default=1),
'colsample_bytree': ag.space.Real(0.5, 1, default=1)
}
hyperparameters = {
'GBM': gbm_options,
'NN': nn_options,
'RF': rf_options,
'XGB': xgb_options
}
search_strategy = 'auto'
hyperparameter_tune_kwargs = {
'scheduler': 'local',
'searcher': search_strategy,
'num_trials': 50,
    'time_limits': 1200
}
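`ag.space.Real(..., log=True)` asks the searcher to sample on a logarithmic scale, so values near 1e-4 and near 1e-2 are explored with equal density; this suits multiplicative parameters like learning rates. A minimal pure-Python sketch of log-uniform sampling (independent of AutoGluon's actual searcher implementation, which may differ):

```python
import math
import random

def sample_log_uniform(lower, upper, rng):
    """Sample uniformly in log space between lower and upper, mimicking
    a search space like ag.space.Real(1e-4, 1e-2, log=True)."""
    log_lo, log_hi = math.log(lower), math.log(upper)
    return math.exp(rng.uniform(log_lo, log_hi))

rng = random.Random(0)
samples = [sample_log_uniform(1e-4, 1e-2, rng) for _ in range(1000)]

# Every sample stays inside the declared bounds.
assert all(1e-4 <= s <= 1e-2 for s in samples)

# Roughly half the samples fall below the geometric midpoint (1e-3),
# the signature of a log-uniform (not linear-uniform) draw.
below_mid = sum(s < 1e-3 for s in samples)
print(below_mid)
```

Under a linear-uniform draw, only about 9% of samples would land below 1e-3; the roughly 50/50 split here is what spreads trials evenly across orders of magnitude.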
predictor_new_hpo = TabularPredictor(
    label="count",
    eval_metric="root_mean_squared_error",
    path="/content/drive/MyDrive",
    learner_kwargs={"ignored_columns": ["casual", "registered"]},
)
predictor_new_hpo.fit(
    train_data=train,
    hyperparameters=hyperparameters,
    hyperparameter_tune_kwargs=hyperparameter_tune_kwargs,
    ag_args_fit={"num_gpus": 1},
)
Warning: path already exists! This predictor may overwrite an existing predictor! path="/content/drive/MyDrive"
Warning: hyperparameter tuning is currently experimental and may cause the process to hang.
Beginning AutoGluon training ...
AutoGluon will save models to "/content/drive/MyDrive/"
AutoGluon Version: 0.7.0
Python Version: 3.10.11
Operating System: Linux
Platform Machine: x86_64
Platform Version: #1 SMP Sat Apr 29 09:15:28 UTC 2023
Train Data Rows: 10886
Train Data Columns: 14
Label Column: count
Preprocessing data ...
AutoGluon infers your prediction problem is: 'regression' (because dtype of label-column == int and many unique label-values observed).
Label info (max, min, mean, stddev): (977, 1, 191.57413, 181.14445)
If 'regression' is not the correct problem_type, please manually specify the problem_type parameter during predictor init (You may specify problem_type as one of: ['binary', 'multiclass', 'regression'])
Using Feature Generators to preprocess the data ...
Dropping user-specified ignored columns: ['casual', 'registered']
Fitting AutoMLPipelineFeatureGenerator...
Available Memory: 11163.02 MB
Train Data (Original) Memory Usage: 0.89 MB (0.0% of available memory)
Inferring data type of each feature based on column values. Set feature_metadata_in to manually specify special dtypes of the features.
Stage 1 Generators:
Fitting AsTypeFeatureGenerator...
Note: Converting 2 features to boolean dtype as they only contain 2 unique values.
Stage 2 Generators:
Fitting FillNaFeatureGenerator...
Stage 3 Generators:
Fitting IdentityFeatureGenerator...
Fitting CategoryFeatureGenerator...
Fitting CategoryMemoryMinimizeFeatureGenerator...
Fitting DatetimeFeatureGenerator...
Stage 4 Generators:
Fitting DropUniqueFeatureGenerator...
Types of features in original data (raw dtype, special dtypes):
('category', []) : 2 | ['season', 'weather']
('datetime', []) : 1 | ['datetime']
('float', []) : 4 | ['temp', 'atemp', 'windspeed', 'temp_humidity']
('int', []) : 5 | ['holiday', 'workingday', 'humidity', 'hour', 'hour_squared']
Types of features in processed data (raw dtype, special dtypes):
('category', []) : 2 | ['season', 'weather']
('float', []) : 4 | ['temp', 'atemp', 'windspeed', 'temp_humidity']
('int', []) : 3 | ['humidity', 'hour', 'hour_squared']
('int', ['bool']) : 2 | ['holiday', 'workingday']
('int', ['datetime_as_int']) : 5 | ['datetime', 'datetime.year', 'datetime.month', 'datetime.day', 'datetime.dayofweek']
0.1s = Fit runtime
12 features in original data used to generate 16 features in processed data.
Train Data (Processed) Memory Usage: 1.09 MB (0.0% of available memory)
Data preprocessing and feature engineering runtime = 0.18s ...
AutoGluon will gauge predictive performance using evaluation metric: 'root_mean_squared_error'
This metric's sign has been flipped to adhere to being higher_is_better. The metric score can be multiplied by -1 to get the metric value.
To change this, specify the eval_metric parameter of Predictor()
Automatically generating train/validation split with holdout_frac=0.2, Train Rows: 8708, Val Rows: 2178
WARNING: "NN" model has been deprecated in v0.4.0 and renamed to "NN_MXNET". Starting in v0.6.0, specifying "NN" or "NN_MXNET" will raise an exception. Consider instead specifying "NN_TORCH".
Fitting 4 L1 models ...
Hyperparameter tuning model: LightGBM ...
Training LightGBM/T1 with GPU, note that this may negatively impact model quality compared to CPU training.
	Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...
	Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-version
	One possible method is:
		pip uninstall lightgbm -y
		pip install lightgbm --install-option=--gpu
	(The same GPU warning and CPU fallback was emitted for every trial, LightGBM/T2 through LightGBM/T50.)
Fitted model: LightGBM/T1 ...  -37.8167 = Validation score (-root_mean_squared_error)  0.45s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T2 ...  -41.0061 = Validation score (-root_mean_squared_error)  0.4s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T3 ...  -38.5552 = Validation score (-root_mean_squared_error)  0.45s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T4 ...  -39.4938 = Validation score (-root_mean_squared_error)  0.38s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T5 ...  -46.3654 = Validation score (-root_mean_squared_error)  0.44s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T6 ...  -77.1386 = Validation score (-root_mean_squared_error)  0.44s = Training runtime  0.01s = Validation runtime
Fitted model: LightGBM/T7 ...  -40.8838 = Validation score (-root_mean_squared_error)  0.37s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T8 ...  -38.1287 = Validation score (-root_mean_squared_error)  0.42s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T9 ...  -39.661 = Validation score (-root_mean_squared_error)  0.39s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T10 ...  -40.532 = Validation score (-root_mean_squared_error)  0.45s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T11 ...  -38.7417 = Validation score (-root_mean_squared_error)  0.44s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T12 ...  -90.247 = Validation score (-root_mean_squared_error)  0.42s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T13 ...  -43.0258 = Validation score (-root_mean_squared_error)  0.57s = Training runtime  0.03s = Validation runtime
Fitted model: LightGBM/T14 ...  -51.4898 = Validation score (-root_mean_squared_error)  0.6s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T15 ...  -37.2635 = Validation score (-root_mean_squared_error)  0.58s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T16 ...  -52.1755 = Validation score (-root_mean_squared_error)  0.61s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T17 ...  -47.6297 = Validation score (-root_mean_squared_error)  0.59s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T18 ...  -61.9795 = Validation score (-root_mean_squared_error)  0.6s = Training runtime  0.04s = Validation runtime
Fitted model: LightGBM/T19 ...  -53.9497 = Validation score (-root_mean_squared_error)  0.61s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T20 ...  -65.3336 = Validation score (-root_mean_squared_error)  0.68s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T21 ...  -39.8685 = Validation score (-root_mean_squared_error)  0.6s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T22 ...  -83.3789 = Validation score (-root_mean_squared_error)  0.73s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T23 ...  -37.5858 = Validation score (-root_mean_squared_error)  0.6s = Training runtime  0.03s = Validation runtime
Fitted model: LightGBM/T24 ...  -43.8296 = Validation score (-root_mean_squared_error)  0.63s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T25 ...  -77.213 = Validation score (-root_mean_squared_error)  0.61s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T26 ...  -39.9297 = Validation score (-root_mean_squared_error)  0.65s = Training runtime  0.04s = Validation runtime
Fitted model: LightGBM/T27 ...  -38.1047 = Validation score (-root_mean_squared_error)  0.66s = Training runtime  0.03s = Validation runtime
Fitted model: LightGBM/T28 ...  -38.0143 = Validation score (-root_mean_squared_error)  0.6s = Training runtime  0.03s = Validation runtime
Fitted model: LightGBM/T29 ...  -87.9706 = Validation score (-root_mean_squared_error)  0.63s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T30 ...  -47.6968 = Validation score (-root_mean_squared_error)  0.74s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T31 ...  -73.2598 = Validation score (-root_mean_squared_error)  0.6s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T32 ...  -54.7039 = Validation score (-root_mean_squared_error)  0.5s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T33 ...  -42.3602 = Validation score (-root_mean_squared_error)  0.43s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T34 ...  -45.3921 = Validation score (-root_mean_squared_error)  0.41s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T35 ...  -56.0761 = Validation score (-root_mean_squared_error)  0.43s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T36 ...  -38.5077 = Validation score (-root_mean_squared_error)  0.45s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T37 ...  -39.4251 = Validation score (-root_mean_squared_error)  0.45s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T38 ...  -46.535 = Validation score (-root_mean_squared_error)  0.42s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T39 ...  -52.2024 = Validation score (-root_mean_squared_error)  0.42s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T40 ...  -40.46 = Validation score (-root_mean_squared_error)  0.49s = Training runtime  0.03s = Validation runtime
Fitted model: LightGBM/T41 ...  -45.5526 = Validation score (-root_mean_squared_error)  0.72s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T42 ...  -38.5441 = Validation score (-root_mean_squared_error)  0.55s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T43 ...  -70.595 = Validation score (-root_mean_squared_error)  0.66s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T44 ...  -56.9317 = Validation score (-root_mean_squared_error)  0.65s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T45 ...  -82.2785 = Validation score (-root_mean_squared_error)  0.6s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T46 ...  -37.6546 = Validation score (-root_mean_squared_error)  0.67s = Training runtime  0.03s = Validation runtime
Fitted model: LightGBM/T47 ...  -38.7366 = Validation score (-root_mean_squared_error)  0.66s = Training runtime  0.03s = Validation runtime
Fitted model: LightGBM/T48 ...  -70.5688 = Validation score (-root_mean_squared_error)  0.63s = Training runtime  0.02s = Validation runtime
Fitted model: LightGBM/T49 ...  -44.4598 = Validation score (-root_mean_squared_error)  0.87s = Training runtime  0.03s = Validation runtime
Fitted model: LightGBM/T50 ...  -48.294 = Validation score (-root_mean_squared_error)  0.76s = Training runtime  0.03s = Validation runtime
Hyperparameter tuning model: RandomForest ...
Fitted model: RandomForest/T1 ... -63.3475 = Validation score (-root_mean_squared_error), 2.81s = Training runtime, 0.44s = Validation runtime
Fitted model: RandomForest/T2 ... -54.9792 = Validation score (-root_mean_squared_error), 4.15s = Training runtime, 0.44s = Validation runtime
Fitted model: RandomForest/T3 ... -81.4429 = Validation score (-root_mean_squared_error), 4.45s = Training runtime, 0.61s = Validation runtime
Fitted model: RandomForest/T4 ... -55.1975 = Validation score (-root_mean_squared_error), 7.18s = Training runtime, 0.91s = Validation runtime
Fitted model: RandomForest/T5 ... -72.8553 = Validation score (-root_mean_squared_error), 3.28s = Training runtime, 0.52s = Validation runtime
Fitted model: RandomForest/T6 ... -50.0103 = Validation score (-root_mean_squared_error), 9.42s = Training runtime, 0.95s = Validation runtime
Fitted model: RandomForest/T7 ... -60.8037 = Validation score (-root_mean_squared_error), 8.04s = Training runtime, 0.91s = Validation runtime
Fitted model: RandomForest/T8 ... -67.4084 = Validation score (-root_mean_squared_error), 6.02s = Training runtime, 1.22s = Validation runtime
Fitted model: RandomForest/T9 ... -72.0469 = Validation score (-root_mean_squared_error), 9.66s = Training runtime, 1.19s = Validation runtime
Fitted model: RandomForest/T10 ... -63.6004 = Validation score (-root_mean_squared_error), 6.73s = Training runtime, 0.97s = Validation runtime
Fitted model: RandomForest/T11 ... -81.5159 = Validation score (-root_mean_squared_error), 6.71s = Training runtime, 1.11s = Validation runtime
Fitted model: RandomForest/T12 ... -50.5664 = Validation score (-root_mean_squared_error), 7.8s = Training runtime, 0.81s = Validation runtime
Fitted model: RandomForest/T13 ... -107.4268 = Validation score (-root_mean_squared_error), 3.2s = Training runtime, 0.66s = Validation runtime
Fitted model: RandomForest/T14 ... -52.9556 = Validation score (-root_mean_squared_error), 2.02s = Training runtime, 0.31s = Validation runtime
Fitted model: RandomForest/T15 ... -82.0361 = Validation score (-root_mean_squared_error), 1.5s = Training runtime, 0.25s = Validation runtime
Fitted model: RandomForest/T16 ... -44.3424 = Validation score (-root_mean_squared_error), 3.35s = Training runtime, 0.47s = Validation runtime
Fitted model: RandomForest/T17 ... -72.6923 = Validation score (-root_mean_squared_error), 6.9s = Training runtime, 1.0s = Validation runtime
Fitted model: RandomForest/T18 ... -49.874 = Validation score (-root_mean_squared_error), 5.6s = Training runtime, 0.54s = Validation runtime
Fitted model: RandomForest/T19 ... -74.0611 = Validation score (-root_mean_squared_error), 2.04s = Training runtime, 0.32s = Validation runtime
Fitted model: RandomForest/T20 ... -100.254 = Validation score (-root_mean_squared_error), 0.68s = Training runtime, 0.13s = Validation runtime
Fitted model: RandomForest/T21 ... -81.9047 = Validation score (-root_mean_squared_error), 5.08s = Training runtime, 1.1s = Validation runtime
Fitted model: RandomForest/T22 ... -43.7505 = Validation score (-root_mean_squared_error), 11.35s = Training runtime, 1.04s = Validation runtime
Fitted model: RandomForest/T23 ... -81.7012 = Validation score (-root_mean_squared_error), 1.03s = Training runtime, 0.17s = Validation runtime
Fitted model: RandomForest/T24 ... -56.1326 = Validation score (-root_mean_squared_error), 1.06s = Training runtime, 0.17s = Validation runtime
Fitted model: RandomForest/T25 ... -50.5928 = Validation score (-root_mean_squared_error), 9.73s = Training runtime, 1.46s = Validation runtime
Fitted model: RandomForest/T26 ... -72.9689 = Validation score (-root_mean_squared_error), 3.51s = Training runtime, 0.45s = Validation runtime
Fitted model: RandomForest/T27 ... -54.7102 = Validation score (-root_mean_squared_error), 9.84s = Training runtime, 0.98s = Validation runtime
Fitted model: RandomForest/T28 ... -107.3937 = Validation score (-root_mean_squared_error), 0.99s = Training runtime, 0.22s = Validation runtime
Fitted model: RandomForest/T29 ... -81.5445 = Validation score (-root_mean_squared_error), 4.05s = Training runtime, 1.08s = Validation runtime
Fitted model: RandomForest/T30 ... -107.4466 = Validation score (-root_mean_squared_error), 2.8s = Training runtime, 0.51s = Validation runtime
Fitted model: RandomForest/T31 ... -90.9826 = Validation score (-root_mean_squared_error), 6.33s = Training runtime, 1.04s = Validation runtime
Fitted model: RandomForest/T32 ... -45.6082 = Validation score (-root_mean_squared_error), 4.24s = Training runtime, 0.41s = Validation runtime
Fitted model: RandomForest/T33 ... -48.6148 = Validation score (-root_mean_squared_error), 5.57s = Training runtime, 0.97s = Validation runtime
Fitted model: RandomForest/T34 ... -57.7044 = Validation score (-root_mean_squared_error), 10.51s = Training runtime, 1.18s = Validation runtime
Fitted model: RandomForest/T35 ... -90.8552 = Validation score (-root_mean_squared_error), 4.35s = Training runtime, 0.6s = Validation runtime
Fitted model: RandomForest/T36 ... -81.4208 = Validation score (-root_mean_squared_error), 3.82s = Training runtime, 0.64s = Validation runtime
Fitted model: RandomForest/T37 ... -52.8985 = Validation score (-root_mean_squared_error), 2.35s = Training runtime, 0.34s = Validation runtime
Fitted model: RandomForest/T38 ... -91.2946 = Validation score (-root_mean_squared_error), 1.43s = Training runtime, 0.2s = Validation runtime
Fitted model: RandomForest/T39 ... -100.3826 = Validation score (-root_mean_squared_error), 1.17s = Training runtime, 0.18s = Validation runtime
Fitted model: RandomForest/T40 ... -51.7635 = Validation score (-root_mean_squared_error), 6.21s = Training runtime, 0.69s = Validation runtime
Fitted model: RandomForest/T41 ... -107.462 = Validation score (-root_mean_squared_error), 1.72s = Training runtime, 0.32s = Validation runtime
Fitted model: RandomForest/T42 ... -59.2748 = Validation score (-root_mean_squared_error), 3.71s = Training runtime, 0.41s = Validation runtime
Fitted model: RandomForest/T43 ... -59.1341 = Validation score (-root_mean_squared_error), 9.46s = Training runtime, 1.38s = Validation runtime
Fitted model: RandomForest/T44 ... -91.0201 = Validation score (-root_mean_squared_error), 8.93s = Training runtime, 1.23s = Validation runtime
Fitted model: RandomForest/T45 ... -58.8103 = Validation score (-root_mean_squared_error), 4.03s = Training runtime, 0.44s = Validation runtime
Fitted model: RandomForest/T46 ... -81.9131 = Validation score (-root_mean_squared_error), 2.74s = Training runtime, 0.48s = Validation runtime
Fitted model: RandomForest/T47 ... -90.7796 = Validation score (-root_mean_squared_error), 2.0s = Training runtime, 0.37s = Validation runtime
Fitted model: RandomForest/T48 ... -44.2115 = Validation score (-root_mean_squared_error), 6.1s = Training runtime, 0.72s = Validation runtime
Fitted model: RandomForest/T49 ... -81.3941 = Validation score (-root_mean_squared_error), 5.33s = Training runtime, 0.74s = Validation runtime
Fitted model: RandomForest/T50 ... -59.3632 = Validation score (-root_mean_squared_error), 5.92s = Training runtime, 0.69s = Validation runtime
Hyperparameter tuning model: XGBoost ...
Fitted model: XGBoost/T1 ... -37.4681 = Validation score (-root_mean_squared_error), 3.23s = Training runtime, 0.08s = Validation runtime
Fitted model: XGBoost/T2 ... -36.9565 = Validation score (-root_mean_squared_error), 2.02s = Training runtime, 0.12s = Validation runtime
Fitted model: XGBoost/T3 ... -35.9155 = Validation score (-root_mean_squared_error), 5.79s = Training runtime, 0.31s = Validation runtime
Fitted model: XGBoost/T4 ... -36.4122 = Validation score (-root_mean_squared_error), 3.87s = Training runtime, 0.07s = Validation runtime
Fitted model: XGBoost/T5 ... -37.1364 = Validation score (-root_mean_squared_error), 3.32s = Training runtime, 0.25s = Validation runtime
Fitted model: XGBoost/T6 ... -38.0911 = Validation score (-root_mean_squared_error), 0.93s = Training runtime, 0.04s = Validation runtime
Fitted model: XGBoost/T7 ... -35.3707 = Validation score (-root_mean_squared_error), 10.72s = Training runtime, 0.45s = Validation runtime
Fitted model: XGBoost/T8 ... -38.8001 = Validation score (-root_mean_squared_error), 2.61s = Training runtime, 0.15s = Validation runtime
Fitted model: XGBoost/T9 ... -36.2512 = Validation score (-root_mean_squared_error), 2.38s = Training runtime, 0.04s = Validation runtime
Fitted model: XGBoost/T10 ... -42.4198 = Validation score (-root_mean_squared_error), 0.74s = Training runtime, 0.02s = Validation runtime
Fitted model: XGBoost/T11 ... -35.451 = Validation score (-root_mean_squared_error), 6.37s = Training runtime, 0.24s = Validation runtime
Fitted model: XGBoost/T12 ... -37.7584 = Validation score (-root_mean_squared_error), 0.71s = Training runtime, 0.03s = Validation runtime
Fitted model: XGBoost/T13 ... -39.5717 = Validation score (-root_mean_squared_error), 6.72s = Training runtime, 0.42s = Validation runtime
Fitted model: XGBoost/T14 ... -36.8078 = Validation score (-root_mean_squared_error), 5.31s = Training runtime, 0.61s = Validation runtime
Fitted model: XGBoost/T15 ... -39.3478 = Validation score (-root_mean_squared_error), 0.93s = Training runtime, 0.04s = Validation runtime
Fitted model: XGBoost/T16 ... -35.8393 = Validation score (-root_mean_squared_error), 6.33s = Training runtime, 0.11s = Validation runtime
Fitted model: XGBoost/T17 ... -40.88 = Validation score (-root_mean_squared_error), 4.28s = Training runtime, 0.06s = Validation runtime
Fitted model: XGBoost/T18 ... -37.4719 = Validation score (-root_mean_squared_error), 1.92s = Training runtime, 0.06s = Validation runtime
Fitted model: XGBoost/T19 ... -38.5073 = Validation score (-root_mean_squared_error), 4.82s = Training runtime, 0.1s = Validation runtime
Fitted model: XGBoost/T20 ... -41.398 = Validation score (-root_mean_squared_error), 0.69s = Training runtime, 0.05s = Validation runtime
Fitted model: XGBoost/T21 ... -36.0892 = Validation score (-root_mean_squared_error), 4.23s = Training runtime, 0.07s = Validation runtime
Fitted model: XGBoost/T22 ... -36.4793 = Validation score (-root_mean_squared_error), 3.34s = Training runtime, 0.03s = Validation runtime
Fitted model: XGBoost/T23 ... -38.491 = Validation score (-root_mean_squared_error), 3.69s = Training runtime, 0.03s = Validation runtime
Fitted model: XGBoost/T24 ... -36.2488 = Validation score (-root_mean_squared_error), 4.21s = Training runtime, 0.06s = Validation runtime
Fitted model: XGBoost/T25 ... -36.2884 = Validation score (-root_mean_squared_error), 3.6s = Training runtime, 0.19s = Validation runtime
Fitted model: XGBoost/T26 ... -38.8951 = Validation score (-root_mean_squared_error), 1.24s = Training runtime, 0.07s = Validation runtime
Fitted model: XGBoost/T27 ... -40.8541 = Validation score (-root_mean_squared_error), 6.06s = Training runtime, 0.3s = Validation runtime
Fitted model: XGBoost/T28 ... -37.3839 = Validation score (-root_mean_squared_error), 12.65s = Training runtime, 0.66s = Validation runtime
Fitted model: XGBoost/T29 ... -36.3733 = Validation score (-root_mean_squared_error), 8.11s = Training runtime, 0.1s = Validation runtime
Fitted model: XGBoost/T30 ... -37.8355 = Validation score (-root_mean_squared_error), 3.92s = Training runtime, 0.02s = Validation runtime
Fitted model: XGBoost/T31 ... -36.4582 = Validation score (-root_mean_squared_error), 3.47s = Training runtime, 0.03s = Validation runtime
Fitted model: XGBoost/T32 ... -38.0476 = Validation score (-root_mean_squared_error), 1.76s = Training runtime, 0.06s = Validation runtime
Fitted model: XGBoost/T33 ... -35.7094 = Validation score (-root_mean_squared_error), 4.63s = Training runtime, 0.22s = Validation runtime
Fitted model: XGBoost/T34 ... -38.2136 = Validation score (-root_mean_squared_error), 5.28s = Training runtime, 0.44s = Validation runtime
Fitted model: XGBoost/T35 ... -39.0263 = Validation score (-root_mean_squared_error), 0.71s = Training runtime, 0.05s = Validation runtime
Fitted model: XGBoost/T36 ... -40.0526 = Validation score (-root_mean_squared_error), 1.59s = Training runtime, 0.14s = Validation runtime
Fitted model: XGBoost/T37 ... -50.0044 = Validation score (-root_mean_squared_error), 1.64s = Training runtime, 0.08s = Validation runtime
Fitted model: XGBoost/T38 ... -38.5685 = Validation score (-root_mean_squared_error), 1.77s = Training runtime, 0.02s = Validation runtime
Fitted model: XGBoost/T39 ... -36.118 = Validation score (-root_mean_squared_error), 11.34s = Training runtime, 0.34s = Validation runtime
Fitted model: XGBoost/T40 ... -37.7967 = Validation score (-root_mean_squared_error), 1.69s = Training runtime, 0.09s = Validation runtime
Fitted model: XGBoost/T41 ... -37.1309 = Validation score (-root_mean_squared_error), 6.69s = Training runtime, 0.89s = Validation runtime
Fitted model: XGBoost/T42 ... -37.7767 = Validation score (-root_mean_squared_error), 3.8s = Training runtime, 0.21s = Validation runtime
Fitted model: XGBoost/T43 ... -38.802 = Validation score (-root_mean_squared_error), 4.0s = Training runtime, 0.07s = Validation runtime
Fitted model: XGBoost/T44 ... -40.8647 = Validation score (-root_mean_squared_error), 6.22s = Training runtime, 0.26s = Validation runtime
Fitted model: XGBoost/T45 ... -38.8218 = Validation score (-root_mean_squared_error), 1.91s = Training runtime, 0.1s = Validation runtime
Fitted model: XGBoost/T46 ... -36.9605 = Validation score (-root_mean_squared_error), 4.23s = Training runtime, 0.1s = Validation runtime
Fitted model: XGBoost/T47 ... -40.966 = Validation score (-root_mean_squared_error), 8.02s = Training runtime, 0.03s = Validation runtime
Fitted model: XGBoost/T48 ... -35.3816 = Validation score (-root_mean_squared_error), 4.59s = Training runtime, 0.18s = Validation runtime
Fitted model: XGBoost/T49 ... -38.0782 = Validation score (-root_mean_squared_error), 3.57s = Training runtime, 0.13s = Validation runtime
Fitted model: XGBoost/T50 ... -37.8414 = Validation score (-root_mean_squared_error), 2.48s = Training runtime, 0.18s = Validation runtime
Hyperparameter tuning model: NeuralNetMXNet ...
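The tree-model tuning runs above can be reduced to a per-family leaderboard with a short snippet. The scores below are transcribed from the log (the best of the 50 trials for each family); recall that AutoGluon negates RMSE because it always maximizes the validation score:

```python
# Best validation scores (negated RMSE) per model family, transcribed
# from the hyperparameter-tuning log above.
best_trials = {
    "LightGBM/T15": -37.2635,
    "RandomForest/T22": -43.7505,
    "XGBoost/T7": -35.3707,
}

# AutoGluon maximizes the negated metric, so the best model is the one
# with the largest score, i.e. the smallest RMSE.
best_model = max(best_trials, key=best_trials.get)
best_rmse = -best_trials[best_model]
print(f"{best_model}: RMSE {best_rmse}")  # XGBoost/T7: RMSE 35.3707
```

XGBoost produced the strongest single trial here, at the cost of the longest training runtime (10.72s for T7).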
WARNING: TabularNeuralNetMxnetModel (alias "NN" & "NN_MXNET") has been deprecated in v0.4.0.
Starting in v0.6.0, calling TabularNeuralNetMxnetModel will raise an exception.
Consider instead using TabularNeuralNetTorchModel via "NN_TORCH".
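Following the warning's advice, one way to avoid the deprecated MXNet backend entirely is to name the model families explicitly in the `hyperparameters` argument of `TabularPredictor.fit`, using the `"NN_TORCH"` alias in place of `"NN_MXNET"`. A minimal sketch (the empty dicts are placeholders meaning "use the default search space"; exact values would come from your own tuning setup):

```python
# Hypothetical sketch: request the Torch-based tabular network instead
# of the deprecated MXNet one. Keys follow AutoGluon's model aliases
# ("GBM" = LightGBM, "RF" = RandomForest, "XGB" = XGBoost).
hyperparameters = {
    "GBM": {},       # LightGBM with its default search space
    "RF": {},        # RandomForest
    "XGB": {},       # XGBoost
    "NN_TORCH": {},  # Torch tabular NN replaces NN_MXNET
}

# This dict would be passed as
#   predictor.fit(train_data, hyperparameters=hyperparameters, ...)
# so TabularNeuralNetMxnetModel is never scheduled for tuning.
assert "NN_MXNET" not in hyperparameters
```

With this configuration the deprecation warning (and the imputation failure below it) would not occur, since the MXNet preprocessing pipeline is never built.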
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
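The traceback bottoms out in sklearn's `SimpleImputer.fit`: the MXNet model's preprocessor passed the string sentinel `!missing!` as the constant fill value for columns sklearn treats as numeric, and a constant fill for numeric data must itself be numeric. A tiny self-contained illustration of that rule (this mimics the check; `impute_constant` is an illustrative name, not AutoGluon or sklearn code):

```python
# Minimal reproduction of the rule behind the ValueError above: a
# constant fill value applied to numeric data must be numeric.
def impute_constant(values, fill_value):
    has_numeric = any(
        isinstance(v, (int, float)) for v in values if v is not None
    )
    if has_numeric and not isinstance(fill_value, (int, float)):
        raise ValueError(
            f"'fill_value'={fill_value!r} is invalid. Expected a "
            "numerical value when imputing numerical data"
        )
    return [fill_value if v is None else v for v in values]

col = [1.0, None, 3.0]
print(impute_constant(col, 2.0))  # [1.0, 2.0, 3.0]
# impute_constant(col, "!missing!") raises ValueError -- the same
# failure mode as the NeuralNetMXNet preprocessing pipeline above.
```

Because every NeuralNetMXNet trial hits this preprocessing step, all of its trials fail the same way; the repeated tracebacks carry no additional information.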
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
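The failing hyperparameter-tuning trials above all share the same root cause: AutoGluon's MXNet tabular neural network preprocessor passes the string sentinel `!missing!` as the `fill_value` of a constant-strategy `SimpleImputer` that is fitting numeric columns, and scikit-learn rejects a non-numeric `fill_value` for numeric data. This usually points to a version mismatch between the installed AutoGluon release and a newer scikit-learn. A minimal reproduction of the scikit-learn check (not AutoGluon's actual pipeline, just the same imputer configuration):

```python
import numpy as np
from sklearn.impute import SimpleImputer

# Constant-fill imputer with a string sentinel, as used for categorical
# columns; applying it to numeric data triggers the ValueError seen above.
imputer = SimpleImputer(strategy="constant", fill_value="!missing!")

raised = False
try:
    imputer.fit(np.array([[1.0], [np.nan]]))
except ValueError as err:
    raised = True
    print(err)  # 'fill_value'=!missing! is invalid. Expected a numerical value ...
```

Since only the NN trials fail, the ensemble can still train on the remaining models; alternatively, installing a scikit-learn version contemporary with this AutoGluon release should let the NN trials run.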
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
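This same traceback repeats for every tuning trial of the MXNet tabular neural network: the legacy `tabular_nn_mxnet` preprocessor passes its categorical fill token (`!missing!`) as `fill_value` to a numeric `SimpleImputer`, which newer scikit-learn releases reject. A common workaround (a sketch, not the only fix — pinning an older scikit-learn also works) is to drop that model from the search space so the remaining trials can proceed. The `hyperparameters` dict below and the commented `TabularPredictor` call are illustrative; adapt the keys to the models you actually want to tune.

```python
# Workaround sketch: exclude the MXNet tabular NN ("NN") from the
# AutoGluon search space so its broken numeric imputer is never fitted.
hyperparameters = {
    "GBM": {},   # LightGBM
    "CAT": {},   # CatBoost
    "XGB": {},   # XGBoost
    # "NN": {},  # omitted: triggers the '!missing!' fill_value ValueError above
}

# Hypothetical usage (uncomment inside the notebook):
# predictor = TabularPredictor(label="count").fit(
#     train_data, hyperparameters=hyperparameters, time_limit=600
# )
```

With the NN entry removed, the tuning run completes using the tree-based models only; alternatively, installing a scikit-learn version that predates the stricter `fill_value` validation restores the original behavior.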
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
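The `ValueError` above is raised by every NeuralNetMXNet HPO trial for the same reason: AutoGluon's MXNet tabular network builds a scikit-learn `SimpleImputer` with `fill_value='!missing!'` even for numeric columns, and newer scikit-learn releases reject a string fill value when imputing numerical data. One workaround (a sketch, not the only fix) is to drop the failing model family from the HPO search space so its trials are never launched; pinning an older scikit-learn (e.g. `pip install "scikit-learn<1.1"`) is another option. The dictionary key names below are assumptions for this AutoGluon release — the neural-net key is `'NN'` in older releases and `'NN_MXNET'` in some later ones, so check the docs for your version.

```python
# Sketch: exclude the MXNet tabular NN from hyperparameter tuning so the
# failing trials are never launched. Key names are assumptions for this
# AutoGluon release; verify against the TabularPredictor.fit docs.
hyperparameters = {
    'GBM': {},  # LightGBM -- trains fine in the run above
    'XGB': {},  # XGBoost
    'RF': {},   # RandomForest
    # 'NN' / 'NN_MXNET' deliberately omitted: every one of its trials
    # raised "ValueError: 'fill_value'=!missing! is invalid ..."
}

# predictor = TabularPredictor(label=...).fit(
#     train_data, hyperparameters=hyperparameters, ...)
```

With the neural net excluded, the ensemble is built from the remaining model families only, which is effectively what happened here anyway once AutoGluon skipped NeuralNetMXNet.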
No model was trained during hyperparameter tuning NeuralNetMXNet... Skipping this model.
Fitting model: WeightedEnsemble_L2 ...
-34.825 = Validation score (-root_mean_squared_error)
0.54s = Training runtime
0.0s = Validation runtime
AutoGluon training complete, total runtime = 617.12s ... Best model: "WeightedEnsemble_L2"
TabularPredictor saved. To load, use: predictor = TabularPredictor.load("/content/drive/MyDrive/")
<autogluon.tabular.predictor.predictor.TabularPredictor at 0x7f2d9b587c40>
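Note the negative validation scores in the log above: AutoGluon reports `-root_mean_squared_error` so that higher is always better regardless of metric. To recover the actual RMSE of the best model, negate the score:

```python
# score_val of WeightedEnsemble_L2 from the training log above
score_val = -34.824982   # AutoGluon's higher-is-better score
rmse = -score_val        # actual root_mean_squared_error
print(rmse)              # 34.824982
```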
predictor_new_hpo.fit_summary()
*** Summary of fit() ***
Estimated performance of each model:
model score_val pred_time_val fit_time pred_time_val_marginal fit_time_marginal stack_level can_infer fit_order
0 WeightedEnsemble_L2 -34.824982 1.056576 30.332862 0.000614 0.539521 2 True 151
1 XGBoost/T7 -35.370728 0.452888 10.724385 0.452888 10.724385 1 True 107
2 XGBoost/T48 -35.381646 0.176290 4.593190 0.176290 4.593190 1 True 148
3 XGBoost/T11 -35.451044 0.239347 6.370124 0.239347 6.370124 1 True 111
4 XGBoost/T33 -35.709377 0.224408 4.632751 0.224408 4.632751 1 True 133
5 XGBoost/T16 -35.839331 0.110362 6.325320 0.110362 6.325320 1 True 116
6 XGBoost/T3 -35.915535 0.313405 5.786649 0.313405 5.786649 1 True 103
7 XGBoost/T21 -36.089177 0.070236 4.232656 0.070236 4.232656 1 True 121
8 XGBoost/T39 -36.118050 0.340521 11.335424 0.340521 11.335424 1 True 139
9 XGBoost/T24 -36.248805 0.059884 4.212552 0.059884 4.212552 1 True 124
10 XGBoost/T9 -36.251221 0.043677 2.378149 0.043677 2.378149 1 True 109
11 XGBoost/T25 -36.288421 0.187271 3.598096 0.187271 3.598096 1 True 125
12 XGBoost/T29 -36.373330 0.095480 8.113579 0.095480 8.113579 1 True 129
13 XGBoost/T4 -36.412193 0.065241 3.866899 0.065241 3.866899 1 True 104
14 XGBoost/T31 -36.458183 0.029854 3.468751 0.029854 3.468751 1 True 131
15 XGBoost/T22 -36.479282 0.033405 3.339128 0.033405 3.339128 1 True 122
16 XGBoost/T14 -36.807766 0.611797 5.307306 0.611797 5.307306 1 True 114
17 XGBoost/T2 -36.956541 0.121290 2.023232 0.121290 2.023232 1 True 102
18 XGBoost/T46 -36.960534 0.101666 4.227032 0.101666 4.227032 1 True 146
19 XGBoost/T41 -37.130904 0.890992 6.689139 0.890992 6.689139 1 True 141
20 XGBoost/T5 -37.136354 0.246589 3.324223 0.246589 3.324223 1 True 105
21 LightGBM/T15 -37.263497 0.022708 0.579387 0.022708 0.579387 1 True 15
22 XGBoost/T28 -37.383946 0.655132 12.651747 0.655132 12.651747 1 True 128
23 XGBoost/T1 -37.468141 0.075688 3.228221 0.075688 3.228221 1 True 101
24 XGBoost/T18 -37.471853 0.059421 1.923841 0.059421 1.923841 1 True 118
25 LightGBM/T23 -37.585843 0.027061 0.600414 0.027061 0.600414 1 True 23
26 LightGBM/T46 -37.654642 0.029801 0.671565 0.029801 0.671565 1 True 46
27 XGBoost/T12 -37.758433 0.028249 0.707264 0.028249 0.707264 1 True 112
28 XGBoost/T42 -37.776750 0.206234 3.804925 0.206234 3.804925 1 True 142
29 XGBoost/T40 -37.796716 0.088802 1.685659 0.088802 1.685659 1 True 140
30 LightGBM/T1 -37.816666 0.023468 0.446192 0.023468 0.446192 1 True 1
31 XGBoost/T30 -37.835511 0.021637 3.919741 0.021637 3.919741 1 True 130
32 XGBoost/T50 -37.841439 0.181163 2.475245 0.181163 2.475245 1 True 150
33 LightGBM/T28 -38.014309 0.027307 0.600521 0.027307 0.600521 1 True 28
34 XGBoost/T32 -38.047589 0.060682 1.764359 0.060682 1.764359 1 True 132
35 XGBoost/T49 -38.078244 0.129801 3.566132 0.129801 3.566132 1 True 149
36 XGBoost/T6 -38.091108 0.044971 0.928680 0.044971 0.928680 1 True 106
37 LightGBM/T27 -38.104693 0.029222 0.658231 0.029222 0.658231 1 True 27
38 LightGBM/T8 -38.128702 0.020808 0.423729 0.020808 0.423729 1 True 8
39 XGBoost/T34 -38.213639 0.435712 5.284575 0.435712 5.284575 1 True 134
40 XGBoost/T23 -38.491040 0.025978 3.690598 0.025978 3.690598 1 True 123
41 XGBoost/T19 -38.507321 0.101976 4.817701 0.101976 4.817701 1 True 119
42 LightGBM/T36 -38.507708 0.020317 0.453047 0.020317 0.453047 1 True 36
43 LightGBM/T42 -38.544139 0.021121 0.550908 0.021121 0.550908 1 True 42
44 LightGBM/T3 -38.555223 0.021869 0.454845 0.021869 0.454845 1 True 3
45 XGBoost/T38 -38.568496 0.021435 1.773480 0.021435 1.773480 1 True 138
46 LightGBM/T47 -38.736605 0.026198 0.657072 0.026198 0.657072 1 True 47
47 LightGBM/T11 -38.741687 0.024696 0.441579 0.024696 0.441579 1 True 11
48 XGBoost/T8 -38.800104 0.152189 2.613728 0.152189 2.613728 1 True 108
49 XGBoost/T43 -38.801975 0.072067 4.002642 0.072067 4.002642 1 True 143
50 XGBoost/T45 -38.821842 0.102957 1.910820 0.102957 1.910820 1 True 145
51 XGBoost/T26 -38.895094 0.069905 1.239523 0.069905 1.239523 1 True 126
52 XGBoost/T35 -39.026268 0.046584 0.713829 0.046584 0.713829 1 True 135
53 XGBoost/T15 -39.347837 0.038802 0.934012 0.038802 0.934012 1 True 115
54 LightGBM/T37 -39.425136 0.023851 0.447129 0.023851 0.447129 1 True 37
55 LightGBM/T4 -39.493831 0.019394 0.381384 0.019394 0.381384 1 True 4
56 XGBoost/T13 -39.571730 0.418748 6.723606 0.418748 6.723606 1 True 113
57 LightGBM/T9 -39.661014 0.022290 0.391613 0.022290 0.391613 1 True 9
58 LightGBM/T21 -39.868515 0.024232 0.596259 0.024232 0.596259 1 True 21
59 LightGBM/T26 -39.929661 0.037447 0.654309 0.037447 0.654309 1 True 26
60 XGBoost/T36 -40.052609 0.143620 1.587037 0.143620 1.587037 1 True 136
61 LightGBM/T40 -40.459954 0.025961 0.486267 0.025961 0.486267 1 True 40
62 LightGBM/T10 -40.532010 0.023538 0.453880 0.023538 0.453880 1 True 10
63 XGBoost/T27 -40.854139 0.303287 6.059901 0.303287 6.059901 1 True 127
64 XGBoost/T44 -40.864666 0.262022 6.222008 0.262022 6.222008 1 True 144
65 XGBoost/T17 -40.880006 0.055315 4.281242 0.055315 4.281242 1 True 117
66 LightGBM/T7 -40.883784 0.019127 0.374458 0.019127 0.374458 1 True 7
67 XGBoost/T47 -40.965963 0.026499 8.022725 0.026499 8.022725 1 True 147
68 LightGBM/T2 -41.006101 0.023755 0.396028 0.023755 0.396028 1 True 2
69 XGBoost/T20 -41.398028 0.052158 0.693346 0.052158 0.693346 1 True 120
70 LightGBM/T33 -42.360195 0.020921 0.433368 0.020921 0.433368 1 True 33
71 XGBoost/T10 -42.419802 0.021459 0.741501 0.021459 0.741501 1 True 110
72 LightGBM/T13 -43.025781 0.026221 0.574811 0.026221 0.574811 1 True 13
73 RandomForest/T22 -43.750535 1.038235 11.347149 1.038235 11.347149 1 True 72
74 LightGBM/T24 -43.829644 0.022077 0.629149 0.022077 0.629149 1 True 24
75 RandomForest/T48 -44.211532 0.716263 6.098989 0.716263 6.098989 1 True 98
76 RandomForest/T16 -44.342407 0.467508 3.347264 0.467508 3.347264 1 True 66
77 LightGBM/T49 -44.459821 0.030795 0.870638 0.030795 0.870638 1 True 49
78 LightGBM/T34 -45.392114 0.016412 0.405848 0.016412 0.405848 1 True 34
79 LightGBM/T41 -45.552581 0.023408 0.724155 0.023408 0.724155 1 True 41
80 RandomForest/T32 -45.608187 0.409798 4.236100 0.409798 4.236100 1 True 82
81 LightGBM/T5 -46.365369 0.022289 0.440435 0.022289 0.440435 1 True 5
82 LightGBM/T38 -46.535047 0.021260 0.423496 0.021260 0.423496 1 True 38
83 LightGBM/T17 -47.629739 0.023263 0.589628 0.023263 0.589628 1 True 17
84 LightGBM/T30 -47.696756 0.023558 0.744116 0.023558 0.744116 1 True 30
85 LightGBM/T50 -48.294043 0.025258 0.758012 0.025258 0.758012 1 True 50
86 RandomForest/T33 -48.614789 0.966132 5.572222 0.966132 5.572222 1 True 83
87 RandomForest/T18 -49.874045 0.540950 5.597194 0.540950 5.597194 1 True 68
88 XGBoost/T37 -50.004433 0.077359 1.643365 0.077359 1.643365 1 True 137
89 RandomForest/T6 -50.010288 0.945707 9.416214 0.945707 9.416214 1 True 56
90 RandomForest/T12 -50.566419 0.808460 7.799920 0.808460 7.799920 1 True 62
91 RandomForest/T25 -50.592786 1.455045 9.729553 1.455045 9.729553 1 True 75
92 LightGBM/T14 -51.489828 0.019886 0.599747 0.019886 0.599747 1 True 14
93 RandomForest/T40 -51.763540 0.687140 6.212815 0.687140 6.212815 1 True 90
94 LightGBM/T16 -52.175496 0.021645 0.609347 0.021645 0.609347 1 True 16
95 LightGBM/T39 -52.202361 0.017766 0.415694 0.017766 0.415694 1 True 39
96 RandomForest/T37 -52.898544 0.340930 2.348939 0.340930 2.348939 1 True 87
97 RandomForest/T14 -52.955623 0.313673 2.023550 0.313673 2.023550 1 True 64
98 LightGBM/T19 -53.949696 0.021359 0.614751 0.021359 0.614751 1 True 19
99 LightGBM/T32 -54.703898 0.017093 0.502866 0.017093 0.502866 1 True 32
100 RandomForest/T27 -54.710235 0.983769 9.837587 0.983769 9.837587 1 True 77
101 RandomForest/T2 -54.979158 0.438312 4.154291 0.438312 4.154291 1 True 52
102 RandomForest/T4 -55.197496 0.907008 7.179701 0.907008 7.179701 1 True 54
103 LightGBM/T35 -56.076120 0.021067 0.431810 0.021067 0.431810 1 True 35
104 RandomForest/T24 -56.132567 0.168750 1.055594 0.168750 1.055594 1 True 74
105 LightGBM/T44 -56.931671 0.021111 0.645902 0.021111 0.645902 1 True 44
106 RandomForest/T34 -57.704400 1.177185 10.514863 1.177185 10.514863 1 True 84
107 RandomForest/T45 -58.810295 0.444645 4.032302 0.444645 4.032302 1 True 95
108 RandomForest/T43 -59.134057 1.377281 9.461160 1.377281 9.461160 1 True 93
109 RandomForest/T42 -59.274821 0.409539 3.712663 0.409539 3.712663 1 True 92
110 RandomForest/T50 -59.363171 0.690313 5.921797 0.690313 5.921797 1 True 100
111 RandomForest/T7 -60.803711 0.913175 8.044477 0.913175 8.044477 1 True 57
112 LightGBM/T18 -61.979550 0.036449 0.604167 0.036449 0.604167 1 True 18
113 RandomForest/T1 -63.347492 0.440350 2.807476 0.440350 2.807476 1 True 51
114 RandomForest/T10 -63.600367 0.973822 6.730473 0.973822 6.730473 1 True 60
115 LightGBM/T20 -65.333621 0.022691 0.682662 0.022691 0.682662 1 True 20
116 RandomForest/T8 -67.408396 1.223523 6.017071 1.223523 6.017071 1 True 58
117 LightGBM/T48 -70.568807 0.020375 0.630915 0.020375 0.630915 1 True 48
118 LightGBM/T43 -70.595025 0.024018 0.656403 0.024018 0.656403 1 True 43
119 RandomForest/T9 -72.046912 1.186031 9.660468 1.186031 9.660468 1 True 59
120 RandomForest/T17 -72.692288 0.996732 6.898207 0.996732 6.898207 1 True 67
121 RandomForest/T5 -72.855293 0.518963 3.277503 0.518963 3.277503 1 True 55
122 RandomForest/T26 -72.968923 0.449905 3.505838 0.449905 3.505838 1 True 76
123 LightGBM/T31 -73.259763 0.019998 0.598683 0.019998 0.598683 1 True 31
124 RandomForest/T19 -74.061139 0.321230 2.044910 0.321230 2.044910 1 True 69
125 LightGBM/T6 -77.138630 0.014323 0.442043 0.014323 0.442043 1 True 6
126 LightGBM/T25 -77.213020 0.023218 0.614967 0.023218 0.614967 1 True 25
127 RandomForest/T49 -81.394114 0.737066 5.332803 0.737066 5.332803 1 True 99
128 RandomForest/T36 -81.420843 0.643681 3.816972 0.643681 3.816972 1 True 86
129 RandomForest/T3 -81.442898 0.610449 4.449816 0.610449 4.449816 1 True 53
130 RandomForest/T11 -81.515891 1.113979 6.709577 1.113979 6.709577 1 True 61
131 RandomForest/T29 -81.544474 1.077253 4.054541 1.077253 4.054541 1 True 79
132 RandomForest/T23 -81.701206 0.167436 1.026792 0.167436 1.026792 1 True 73
133 RandomForest/T21 -81.904746 1.102099 5.084297 1.102099 5.084297 1 True 71
134 RandomForest/T46 -81.913083 0.475453 2.735275 0.475453 2.735275 1 True 96
135 RandomForest/T15 -82.036052 0.253716 1.504421 0.253716 1.504421 1 True 65
136 LightGBM/T45 -82.278523 0.019782 0.596736 0.019782 0.596736 1 True 45
137 LightGBM/T22 -83.378925 0.023512 0.730594 0.023512 0.730594 1 True 22
138 LightGBM/T29 -87.970586 0.019060 0.632013 0.019060 0.632013 1 True 29
139 LightGBM/T12 -90.247001 0.018169 0.424536 0.018169 0.424536 1 True 12
140 RandomForest/T47 -90.779582 0.368580 1.998972 0.368580 1.998972 1 True 97
141 RandomForest/T35 -90.855238 0.602679 4.345323 0.602679 4.345323 1 True 85
142 RandomForest/T31 -90.982586 1.043301 6.326571 1.043301 6.326571 1 True 81
143 RandomForest/T44 -91.020121 1.234207 8.929496 1.234207 8.929496 1 True 94
144 RandomForest/T38 -91.294614 0.198331 1.429831 0.198331 1.429831 1 True 88
145 RandomForest/T20 -100.254017 0.133208 0.676770 0.133208 0.676770 1 True 70
146 RandomForest/T39 -100.382603 0.175341 1.166851 0.175341 1.166851 1 True 89
147 RandomForest/T28 -107.393671 0.215009 0.986697 0.215009 0.986697 1 True 78
148 RandomForest/T13 -107.426838 0.657933 3.204717 0.657933 3.204717 1 True 63
149 RandomForest/T30 -107.446557 0.510362 2.800887 0.510362 2.800887 1 True 80
150 RandomForest/T41 -107.462009 0.321278 1.722017 0.321278 1.722017 1 True 91
Number of models trained: 151
Types of models trained:
{'XGBoostModel', 'LGBModel', 'WeightedEnsembleModel', 'RFModel'}
Bagging used: False
Multi-layer stack-ensembling used: False
Feature Metadata (Processed):
(raw dtype, special dtypes):
('category', []) : 2 | ['season', 'weather']
('float', []) : 4 | ['temp', 'atemp', 'windspeed', 'temp_humidity']
('int', []) : 3 | ['humidity', 'hour', 'hour_squared']
('int', ['bool']) : 2 | ['holiday', 'workingday']
('int', ['datetime_as_int']) : 5 | ['datetime', 'datetime.year', 'datetime.month', 'datetime.day', 'datetime.dayofweek']
*** End of fit() summary ***
/usr/local/lib/python3.10/dist-packages/autogluon/core/utils/plots.py:138: UserWarning: AutoGluon summary plots cannot be created because bokeh is not installed. To see plots, please do: "pip install bokeh==2.0.1"
warnings.warn('AutoGluon summary plots cannot be created because bokeh is not installed. To see plots, please do: "pip install bokeh==2.0.1"')
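Since the bokeh plots are unavailable here, the summary dict printed below can be inspected programmatically instead. The following is a minimal sketch of how one might pull the best score per model family out of the `model_performance` mapping; the small dict literal is a hypothetical excerpt copied from a few entries of the output below (scores are negative RMSE, so higher, i.e. less negative, is better):

```python
# Hypothetical excerpt of fit_summary()['model_performance'];
# values copied from the printed output (negative RMSE, higher is better).
model_performance = {
    "LightGBM/T15": -37.26349683273912,
    "RandomForest/T48": -44.21153224730815,
    "XGBoost/T48": -35.38164645539363,
    "WeightedEnsemble_L2": -34.82498213886387,
}

# Best (least negative) score seen for each model family.
best = {}
for name, score in model_performance.items():
    family = name.split("/")[0]  # "LightGBM/T15" -> "LightGBM"
    if family not in best or score > best[family]:
        best[family] = score

# Overall best model is the key with the maximum score.
best_model = max(model_performance, key=model_performance.get)
print(best_model)  # WeightedEnsemble_L2
```

Running this on the full dict would reproduce the `'model_best': 'WeightedEnsemble_L2'` entry reported in the summary, since the weighted ensemble has the highest validation score of all 151 models.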
{'model_types': {'LightGBM/T1': 'LGBModel',
'LightGBM/T2': 'LGBModel',
'LightGBM/T3': 'LGBModel',
'LightGBM/T4': 'LGBModel',
'LightGBM/T5': 'LGBModel',
'LightGBM/T6': 'LGBModel',
'LightGBM/T7': 'LGBModel',
'LightGBM/T8': 'LGBModel',
'LightGBM/T9': 'LGBModel',
'LightGBM/T10': 'LGBModel',
'LightGBM/T11': 'LGBModel',
'LightGBM/T12': 'LGBModel',
'LightGBM/T13': 'LGBModel',
'LightGBM/T14': 'LGBModel',
'LightGBM/T15': 'LGBModel',
'LightGBM/T16': 'LGBModel',
'LightGBM/T17': 'LGBModel',
'LightGBM/T18': 'LGBModel',
'LightGBM/T19': 'LGBModel',
'LightGBM/T20': 'LGBModel',
'LightGBM/T21': 'LGBModel',
'LightGBM/T22': 'LGBModel',
'LightGBM/T23': 'LGBModel',
'LightGBM/T24': 'LGBModel',
'LightGBM/T25': 'LGBModel',
'LightGBM/T26': 'LGBModel',
'LightGBM/T27': 'LGBModel',
'LightGBM/T28': 'LGBModel',
'LightGBM/T29': 'LGBModel',
'LightGBM/T30': 'LGBModel',
'LightGBM/T31': 'LGBModel',
'LightGBM/T32': 'LGBModel',
'LightGBM/T33': 'LGBModel',
'LightGBM/T34': 'LGBModel',
'LightGBM/T35': 'LGBModel',
'LightGBM/T36': 'LGBModel',
'LightGBM/T37': 'LGBModel',
'LightGBM/T38': 'LGBModel',
'LightGBM/T39': 'LGBModel',
'LightGBM/T40': 'LGBModel',
'LightGBM/T41': 'LGBModel',
'LightGBM/T42': 'LGBModel',
'LightGBM/T43': 'LGBModel',
'LightGBM/T44': 'LGBModel',
'LightGBM/T45': 'LGBModel',
'LightGBM/T46': 'LGBModel',
'LightGBM/T47': 'LGBModel',
'LightGBM/T48': 'LGBModel',
'LightGBM/T49': 'LGBModel',
'LightGBM/T50': 'LGBModel',
'RandomForest/T1': 'RFModel',
'RandomForest/T2': 'RFModel',
'RandomForest/T3': 'RFModel',
'RandomForest/T4': 'RFModel',
'RandomForest/T5': 'RFModel',
'RandomForest/T6': 'RFModel',
'RandomForest/T7': 'RFModel',
'RandomForest/T8': 'RFModel',
'RandomForest/T9': 'RFModel',
'RandomForest/T10': 'RFModel',
'RandomForest/T11': 'RFModel',
'RandomForest/T12': 'RFModel',
'RandomForest/T13': 'RFModel',
'RandomForest/T14': 'RFModel',
'RandomForest/T15': 'RFModel',
'RandomForest/T16': 'RFModel',
'RandomForest/T17': 'RFModel',
'RandomForest/T18': 'RFModel',
'RandomForest/T19': 'RFModel',
'RandomForest/T20': 'RFModel',
'RandomForest/T21': 'RFModel',
'RandomForest/T22': 'RFModel',
'RandomForest/T23': 'RFModel',
'RandomForest/T24': 'RFModel',
'RandomForest/T25': 'RFModel',
'RandomForest/T26': 'RFModel',
'RandomForest/T27': 'RFModel',
'RandomForest/T28': 'RFModel',
'RandomForest/T29': 'RFModel',
'RandomForest/T30': 'RFModel',
'RandomForest/T31': 'RFModel',
'RandomForest/T32': 'RFModel',
'RandomForest/T33': 'RFModel',
'RandomForest/T34': 'RFModel',
'RandomForest/T35': 'RFModel',
'RandomForest/T36': 'RFModel',
'RandomForest/T37': 'RFModel',
'RandomForest/T38': 'RFModel',
'RandomForest/T39': 'RFModel',
'RandomForest/T40': 'RFModel',
'RandomForest/T41': 'RFModel',
'RandomForest/T42': 'RFModel',
'RandomForest/T43': 'RFModel',
'RandomForest/T44': 'RFModel',
'RandomForest/T45': 'RFModel',
'RandomForest/T46': 'RFModel',
'RandomForest/T47': 'RFModel',
'RandomForest/T48': 'RFModel',
'RandomForest/T49': 'RFModel',
'RandomForest/T50': 'RFModel',
'XGBoost/T1': 'XGBoostModel',
'XGBoost/T2': 'XGBoostModel',
'XGBoost/T3': 'XGBoostModel',
'XGBoost/T4': 'XGBoostModel',
'XGBoost/T5': 'XGBoostModel',
'XGBoost/T6': 'XGBoostModel',
'XGBoost/T7': 'XGBoostModel',
'XGBoost/T8': 'XGBoostModel',
'XGBoost/T9': 'XGBoostModel',
'XGBoost/T10': 'XGBoostModel',
'XGBoost/T11': 'XGBoostModel',
'XGBoost/T12': 'XGBoostModel',
'XGBoost/T13': 'XGBoostModel',
'XGBoost/T14': 'XGBoostModel',
'XGBoost/T15': 'XGBoostModel',
'XGBoost/T16': 'XGBoostModel',
'XGBoost/T17': 'XGBoostModel',
'XGBoost/T18': 'XGBoostModel',
'XGBoost/T19': 'XGBoostModel',
'XGBoost/T20': 'XGBoostModel',
'XGBoost/T21': 'XGBoostModel',
'XGBoost/T22': 'XGBoostModel',
'XGBoost/T23': 'XGBoostModel',
'XGBoost/T24': 'XGBoostModel',
'XGBoost/T25': 'XGBoostModel',
'XGBoost/T26': 'XGBoostModel',
'XGBoost/T27': 'XGBoostModel',
'XGBoost/T28': 'XGBoostModel',
'XGBoost/T29': 'XGBoostModel',
'XGBoost/T30': 'XGBoostModel',
'XGBoost/T31': 'XGBoostModel',
'XGBoost/T32': 'XGBoostModel',
'XGBoost/T33': 'XGBoostModel',
'XGBoost/T34': 'XGBoostModel',
'XGBoost/T35': 'XGBoostModel',
'XGBoost/T36': 'XGBoostModel',
'XGBoost/T37': 'XGBoostModel',
'XGBoost/T38': 'XGBoostModel',
'XGBoost/T39': 'XGBoostModel',
'XGBoost/T40': 'XGBoostModel',
'XGBoost/T41': 'XGBoostModel',
'XGBoost/T42': 'XGBoostModel',
'XGBoost/T43': 'XGBoostModel',
'XGBoost/T44': 'XGBoostModel',
'XGBoost/T45': 'XGBoostModel',
'XGBoost/T46': 'XGBoostModel',
'XGBoost/T47': 'XGBoostModel',
'XGBoost/T48': 'XGBoostModel',
'XGBoost/T49': 'XGBoostModel',
'XGBoost/T50': 'XGBoostModel',
'WeightedEnsemble_L2': 'WeightedEnsembleModel'},
'model_performance': {'LightGBM/T1': -37.816665631954244,
'LightGBM/T2': -41.00610106910873,
'LightGBM/T3': -38.55522265768774,
'LightGBM/T4': -39.49383137984814,
'LightGBM/T5': -46.36536928364148,
'LightGBM/T6': -77.1386295836808,
'LightGBM/T7': -40.88378418574723,
'LightGBM/T8': -38.12870161678881,
'LightGBM/T9': -39.66101425230072,
'LightGBM/T10': -40.53201022840687,
'LightGBM/T11': -38.741686631441425,
'LightGBM/T12': -90.24700052974642,
'LightGBM/T13': -43.02578098393248,
'LightGBM/T14': -51.48982757733833,
'LightGBM/T15': -37.26349683273912,
'LightGBM/T16': -52.175496123605754,
'LightGBM/T17': -47.62973867464604,
'LightGBM/T18': -61.9795496651608,
'LightGBM/T19': -53.94969639701796,
'LightGBM/T20': -65.33362095228514,
'LightGBM/T21': -39.8685152358332,
'LightGBM/T22': -83.37892540977087,
'LightGBM/T23': -37.58584338112727,
'LightGBM/T24': -43.8296436874419,
'LightGBM/T25': -77.21302011200737,
'LightGBM/T26': -39.929661380656384,
'LightGBM/T27': -38.10469279989536,
'LightGBM/T28': -38.01430929870956,
'LightGBM/T29': -87.97058611583202,
'LightGBM/T30': -47.69675598499414,
'LightGBM/T31': -73.25976263680822,
'LightGBM/T32': -54.70389798041698,
'LightGBM/T33': -42.36019474949604,
'LightGBM/T34': -45.39211386538975,
'LightGBM/T35': -56.076120312963305,
'LightGBM/T36': -38.50770813330047,
'LightGBM/T37': -39.42513637268889,
'LightGBM/T38': -46.53504664918384,
'LightGBM/T39': -52.20236141811973,
'LightGBM/T40': -40.45995364965635,
'LightGBM/T41': -45.552580999998575,
'LightGBM/T42': -38.54413922717227,
'LightGBM/T43': -70.59502482358404,
'LightGBM/T44': -56.931671028285194,
'LightGBM/T45': -82.27852255781701,
'LightGBM/T46': -37.65464229950021,
'LightGBM/T47': -38.73660535175479,
'LightGBM/T48': -70.56880661682696,
'LightGBM/T49': -44.45982071536047,
'LightGBM/T50': -48.29404334925084,
'RandomForest/T1': -63.34749219502903,
'RandomForest/T2': -54.97915807611765,
'RandomForest/T3': -81.44289848363293,
'RandomForest/T4': -55.1974960624872,
'RandomForest/T5': -72.85529339062978,
'RandomForest/T6': -50.01028842723625,
'RandomForest/T7': -60.80371091839002,
'RandomForest/T8': -67.40839604833891,
'RandomForest/T9': -72.0469123177313,
'RandomForest/T10': -63.60036667444705,
'RandomForest/T11': -81.51589128604316,
'RandomForest/T12': -50.56641868347309,
'RandomForest/T13': -107.42683823943017,
'RandomForest/T14': -52.95562289266679,
'RandomForest/T15': -82.0360523063029,
'RandomForest/T16': -44.34240711198643,
'RandomForest/T17': -72.6922878048895,
'RandomForest/T18': -49.87404549020824,
'RandomForest/T19': -74.06113929827654,
'RandomForest/T20': -100.25401673615194,
'RandomForest/T21': -81.90474610171765,
'RandomForest/T22': -43.75053548785061,
'RandomForest/T23': -81.70120632503262,
'RandomForest/T24': -56.132566854606665,
'RandomForest/T25': -50.59278563089286,
'RandomForest/T26': -72.96892264365167,
'RandomForest/T27': -54.71023510751187,
'RandomForest/T28': -107.39367086634141,
'RandomForest/T29': -81.54447367980762,
'RandomForest/T30': -107.44655672084481,
'RandomForest/T31': -90.98258580105733,
'RandomForest/T32': -45.608186759736085,
'RandomForest/T33': -48.61478923219133,
'RandomForest/T34': -57.70439990118621,
'RandomForest/T35': -90.85523761825702,
'RandomForest/T36': -81.42084314335702,
'RandomForest/T37': -52.89854392907124,
'RandomForest/T38': -91.29461376565986,
'RandomForest/T39': -100.3826027187973,
'RandomForest/T40': -51.76354016319063,
'RandomForest/T41': -107.46200946357142,
'RandomForest/T42': -59.27482071081472,
'RandomForest/T43': -59.1340573554175,
'RandomForest/T44': -91.0201213962395,
'RandomForest/T45': -58.81029519147745,
'RandomForest/T46': -81.91308327953355,
'RandomForest/T47': -90.77958190083888,
'RandomForest/T48': -44.21153224730815,
'RandomForest/T49': -81.39411376892662,
'RandomForest/T50': -59.36317078033903,
'XGBoost/T1': -37.46814055813217,
'XGBoost/T2': -36.956541285645834,
'XGBoost/T3': -35.91553513322565,
'XGBoost/T4': -36.41219253221283,
'XGBoost/T5': -37.13635449412413,
'XGBoost/T6': -38.0911081461406,
'XGBoost/T7': -35.370728168613205,
'XGBoost/T8': -38.80010438634216,
'XGBoost/T9': -36.25122120366462,
'XGBoost/T10': -42.419802224091264,
'XGBoost/T11': -35.45104408003755,
'XGBoost/T12': -37.758433232661076,
'XGBoost/T13': -39.57173042458874,
'XGBoost/T14': -36.807766357831035,
'XGBoost/T15': -39.34783684427042,
'XGBoost/T16': -35.839330856075904,
'XGBoost/T17': -40.88000620202415,
'XGBoost/T18': -37.471853010576346,
'XGBoost/T19': -38.507321422175664,
'XGBoost/T20': -41.39802833279148,
'XGBoost/T21': -36.08917714050745,
'XGBoost/T22': -36.479281750987674,
'XGBoost/T23': -38.491039756090785,
'XGBoost/T24': -36.248804777413234,
'XGBoost/T25': -36.28842103106481,
'XGBoost/T26': -38.89509422977031,
'XGBoost/T27': -40.85413877265607,
'XGBoost/T28': -37.38394571446976,
'XGBoost/T29': -36.37333042269951,
'XGBoost/T30': -37.835511037212875,
'XGBoost/T31': -36.45818285403173,
'XGBoost/T32': -38.047588940160864,
'XGBoost/T33': -35.709377392504045,
'XGBoost/T34': -38.213638527588216,
'XGBoost/T35': -39.026268334183044,
'XGBoost/T36': -40.05260895243657,
'XGBoost/T37': -50.004433199754956,
'XGBoost/T38': -38.56849565354197,
'XGBoost/T39': -36.118049930031205,
'XGBoost/T40': -37.79671555831119,
'XGBoost/T41': -37.13090379210739,
'XGBoost/T42': -37.77674969442705,
'XGBoost/T43': -38.80197521391477,
'XGBoost/T44': -40.86466578134525,
'XGBoost/T45': -38.82184197426117,
'XGBoost/T46': -36.96053422684872,
'XGBoost/T47': -40.96596324693844,
'XGBoost/T48': -35.38164645539363,
'XGBoost/T49': -38.07824354447093,
'XGBoost/T50': -37.84143943841117,
'WeightedEnsemble_L2': -34.82498213886387},
'model_best': 'WeightedEnsemble_L2',
'model_paths': {'LightGBM/T1': '/content/drive/MyDrive/models/LightGBM/T1/',
'LightGBM/T2': '/content/drive/MyDrive/models/LightGBM/T2/',
'LightGBM/T3': '/content/drive/MyDrive/models/LightGBM/T3/',
'LightGBM/T4': '/content/drive/MyDrive/models/LightGBM/T4/',
'LightGBM/T5': '/content/drive/MyDrive/models/LightGBM/T5/',
'LightGBM/T6': '/content/drive/MyDrive/models/LightGBM/T6/',
'LightGBM/T7': '/content/drive/MyDrive/models/LightGBM/T7/',
'LightGBM/T8': '/content/drive/MyDrive/models/LightGBM/T8/',
'LightGBM/T9': '/content/drive/MyDrive/models/LightGBM/T9/',
'LightGBM/T10': '/content/drive/MyDrive/models/LightGBM/T10/',
'LightGBM/T11': '/content/drive/MyDrive/models/LightGBM/T11/',
'LightGBM/T12': '/content/drive/MyDrive/models/LightGBM/T12/',
'LightGBM/T13': '/content/drive/MyDrive/models/LightGBM/T13/',
'LightGBM/T14': '/content/drive/MyDrive/models/LightGBM/T14/',
'LightGBM/T15': '/content/drive/MyDrive/models/LightGBM/T15/',
'LightGBM/T16': '/content/drive/MyDrive/models/LightGBM/T16/',
'LightGBM/T17': '/content/drive/MyDrive/models/LightGBM/T17/',
'LightGBM/T18': '/content/drive/MyDrive/models/LightGBM/T18/',
'LightGBM/T19': '/content/drive/MyDrive/models/LightGBM/T19/',
'LightGBM/T20': '/content/drive/MyDrive/models/LightGBM/T20/',
'LightGBM/T21': '/content/drive/MyDrive/models/LightGBM/T21/',
'LightGBM/T22': '/content/drive/MyDrive/models/LightGBM/T22/',
'LightGBM/T23': '/content/drive/MyDrive/models/LightGBM/T23/',
'LightGBM/T24': '/content/drive/MyDrive/models/LightGBM/T24/',
'LightGBM/T25': '/content/drive/MyDrive/models/LightGBM/T25/',
'LightGBM/T26': '/content/drive/MyDrive/models/LightGBM/T26/',
'LightGBM/T27': '/content/drive/MyDrive/models/LightGBM/T27/',
'LightGBM/T28': '/content/drive/MyDrive/models/LightGBM/T28/',
'LightGBM/T29': '/content/drive/MyDrive/models/LightGBM/T29/',
'LightGBM/T30': '/content/drive/MyDrive/models/LightGBM/T30/',
'LightGBM/T31': '/content/drive/MyDrive/models/LightGBM/T31/',
'LightGBM/T32': '/content/drive/MyDrive/models/LightGBM/T32/',
'LightGBM/T33': '/content/drive/MyDrive/models/LightGBM/T33/',
'LightGBM/T34': '/content/drive/MyDrive/models/LightGBM/T34/',
'LightGBM/T35': '/content/drive/MyDrive/models/LightGBM/T35/',
'LightGBM/T36': '/content/drive/MyDrive/models/LightGBM/T36/',
'LightGBM/T37': '/content/drive/MyDrive/models/LightGBM/T37/',
'LightGBM/T38': '/content/drive/MyDrive/models/LightGBM/T38/',
'LightGBM/T39': '/content/drive/MyDrive/models/LightGBM/T39/',
'LightGBM/T40': '/content/drive/MyDrive/models/LightGBM/T40/',
'LightGBM/T41': '/content/drive/MyDrive/models/LightGBM/T41/',
'LightGBM/T42': '/content/drive/MyDrive/models/LightGBM/T42/',
'LightGBM/T43': '/content/drive/MyDrive/models/LightGBM/T43/',
'LightGBM/T44': '/content/drive/MyDrive/models/LightGBM/T44/',
'LightGBM/T45': '/content/drive/MyDrive/models/LightGBM/T45/',
'LightGBM/T46': '/content/drive/MyDrive/models/LightGBM/T46/',
'LightGBM/T47': '/content/drive/MyDrive/models/LightGBM/T47/',
'LightGBM/T48': '/content/drive/MyDrive/models/LightGBM/T48/',
'LightGBM/T49': '/content/drive/MyDrive/models/LightGBM/T49/',
'LightGBM/T50': '/content/drive/MyDrive/models/LightGBM/T50/',
'RandomForest/T1': '/content/drive/MyDrive/models/RandomForest/T1/',
'RandomForest/T2': '/content/drive/MyDrive/models/RandomForest/T2/',
'RandomForest/T3': '/content/drive/MyDrive/models/RandomForest/T3/',
'RandomForest/T4': '/content/drive/MyDrive/models/RandomForest/T4/',
'RandomForest/T5': '/content/drive/MyDrive/models/RandomForest/T5/',
'RandomForest/T6': '/content/drive/MyDrive/models/RandomForest/T6/',
'RandomForest/T7': '/content/drive/MyDrive/models/RandomForest/T7/',
'RandomForest/T8': '/content/drive/MyDrive/models/RandomForest/T8/',
'RandomForest/T9': '/content/drive/MyDrive/models/RandomForest/T9/',
'RandomForest/T10': '/content/drive/MyDrive/models/RandomForest/T10/',
'RandomForest/T11': '/content/drive/MyDrive/models/RandomForest/T11/',
'RandomForest/T12': '/content/drive/MyDrive/models/RandomForest/T12/',
'RandomForest/T13': '/content/drive/MyDrive/models/RandomForest/T13/',
'RandomForest/T14': '/content/drive/MyDrive/models/RandomForest/T14/',
'RandomForest/T15': '/content/drive/MyDrive/models/RandomForest/T15/',
'RandomForest/T16': '/content/drive/MyDrive/models/RandomForest/T16/',
'RandomForest/T17': '/content/drive/MyDrive/models/RandomForest/T17/',
'RandomForest/T18': '/content/drive/MyDrive/models/RandomForest/T18/',
'RandomForest/T19': '/content/drive/MyDrive/models/RandomForest/T19/',
'RandomForest/T20': '/content/drive/MyDrive/models/RandomForest/T20/',
'RandomForest/T21': '/content/drive/MyDrive/models/RandomForest/T21/',
'RandomForest/T22': '/content/drive/MyDrive/models/RandomForest/T22/',
'RandomForest/T23': '/content/drive/MyDrive/models/RandomForest/T23/',
'RandomForest/T24': '/content/drive/MyDrive/models/RandomForest/T24/',
'RandomForest/T25': '/content/drive/MyDrive/models/RandomForest/T25/',
'RandomForest/T26': '/content/drive/MyDrive/models/RandomForest/T26/',
'RandomForest/T27': '/content/drive/MyDrive/models/RandomForest/T27/',
'RandomForest/T28': '/content/drive/MyDrive/models/RandomForest/T28/',
'RandomForest/T29': '/content/drive/MyDrive/models/RandomForest/T29/',
'RandomForest/T30': '/content/drive/MyDrive/models/RandomForest/T30/',
'RandomForest/T31': '/content/drive/MyDrive/models/RandomForest/T31/',
'RandomForest/T32': '/content/drive/MyDrive/models/RandomForest/T32/',
'RandomForest/T33': '/content/drive/MyDrive/models/RandomForest/T33/',
'RandomForest/T34': '/content/drive/MyDrive/models/RandomForest/T34/',
'RandomForest/T35': '/content/drive/MyDrive/models/RandomForest/T35/',
'RandomForest/T36': '/content/drive/MyDrive/models/RandomForest/T36/',
'RandomForest/T37': '/content/drive/MyDrive/models/RandomForest/T37/',
'RandomForest/T38': '/content/drive/MyDrive/models/RandomForest/T38/',
'RandomForest/T39': '/content/drive/MyDrive/models/RandomForest/T39/',
'RandomForest/T40': '/content/drive/MyDrive/models/RandomForest/T40/',
'RandomForest/T41': '/content/drive/MyDrive/models/RandomForest/T41/',
'RandomForest/T42': '/content/drive/MyDrive/models/RandomForest/T42/',
'RandomForest/T43': '/content/drive/MyDrive/models/RandomForest/T43/',
'RandomForest/T44': '/content/drive/MyDrive/models/RandomForest/T44/',
'RandomForest/T45': '/content/drive/MyDrive/models/RandomForest/T45/',
'RandomForest/T46': '/content/drive/MyDrive/models/RandomForest/T46/',
'RandomForest/T47': '/content/drive/MyDrive/models/RandomForest/T47/',
'RandomForest/T48': '/content/drive/MyDrive/models/RandomForest/T48/',
'RandomForest/T49': '/content/drive/MyDrive/models/RandomForest/T49/',
'RandomForest/T50': '/content/drive/MyDrive/models/RandomForest/T50/',
'XGBoost/T1': '/content/drive/MyDrive/models/XGBoost/T1/',
'XGBoost/T2': '/content/drive/MyDrive/models/XGBoost/T2/',
'XGBoost/T3': '/content/drive/MyDrive/models/XGBoost/T3/',
'XGBoost/T4': '/content/drive/MyDrive/models/XGBoost/T4/',
'XGBoost/T5': '/content/drive/MyDrive/models/XGBoost/T5/',
'XGBoost/T6': '/content/drive/MyDrive/models/XGBoost/T6/',
'XGBoost/T7': '/content/drive/MyDrive/models/XGBoost/T7/',
'XGBoost/T8': '/content/drive/MyDrive/models/XGBoost/T8/',
'XGBoost/T9': '/content/drive/MyDrive/models/XGBoost/T9/',
'XGBoost/T10': '/content/drive/MyDrive/models/XGBoost/T10/',
'XGBoost/T11': '/content/drive/MyDrive/models/XGBoost/T11/',
'XGBoost/T12': '/content/drive/MyDrive/models/XGBoost/T12/',
'XGBoost/T13': '/content/drive/MyDrive/models/XGBoost/T13/',
'XGBoost/T14': '/content/drive/MyDrive/models/XGBoost/T14/',
'XGBoost/T15': '/content/drive/MyDrive/models/XGBoost/T15/',
'XGBoost/T16': '/content/drive/MyDrive/models/XGBoost/T16/',
'XGBoost/T17': '/content/drive/MyDrive/models/XGBoost/T17/',
'XGBoost/T18': '/content/drive/MyDrive/models/XGBoost/T18/',
'XGBoost/T19': '/content/drive/MyDrive/models/XGBoost/T19/',
'XGBoost/T20': '/content/drive/MyDrive/models/XGBoost/T20/',
'XGBoost/T21': '/content/drive/MyDrive/models/XGBoost/T21/',
'XGBoost/T22': '/content/drive/MyDrive/models/XGBoost/T22/',
'XGBoost/T23': '/content/drive/MyDrive/models/XGBoost/T23/',
'XGBoost/T24': '/content/drive/MyDrive/models/XGBoost/T24/',
'XGBoost/T25': '/content/drive/MyDrive/models/XGBoost/T25/',
'XGBoost/T26': '/content/drive/MyDrive/models/XGBoost/T26/',
'XGBoost/T27': '/content/drive/MyDrive/models/XGBoost/T27/',
'XGBoost/T28': '/content/drive/MyDrive/models/XGBoost/T28/',
'XGBoost/T29': '/content/drive/MyDrive/models/XGBoost/T29/',
'XGBoost/T30': '/content/drive/MyDrive/models/XGBoost/T30/',
'XGBoost/T31': '/content/drive/MyDrive/models/XGBoost/T31/',
'XGBoost/T32': '/content/drive/MyDrive/models/XGBoost/T32/',
'XGBoost/T33': '/content/drive/MyDrive/models/XGBoost/T33/',
'XGBoost/T34': '/content/drive/MyDrive/models/XGBoost/T34/',
'XGBoost/T35': '/content/drive/MyDrive/models/XGBoost/T35/',
'XGBoost/T36': '/content/drive/MyDrive/models/XGBoost/T36/',
'XGBoost/T37': '/content/drive/MyDrive/models/XGBoost/T37/',
'XGBoost/T38': '/content/drive/MyDrive/models/XGBoost/T38/',
'XGBoost/T39': '/content/drive/MyDrive/models/XGBoost/T39/',
'XGBoost/T40': '/content/drive/MyDrive/models/XGBoost/T40/',
'XGBoost/T41': '/content/drive/MyDrive/models/XGBoost/T41/',
'XGBoost/T42': '/content/drive/MyDrive/models/XGBoost/T42/',
'XGBoost/T43': '/content/drive/MyDrive/models/XGBoost/T43/',
'XGBoost/T44': '/content/drive/MyDrive/models/XGBoost/T44/',
'XGBoost/T45': '/content/drive/MyDrive/models/XGBoost/T45/',
'XGBoost/T46': '/content/drive/MyDrive/models/XGBoost/T46/',
'XGBoost/T47': '/content/drive/MyDrive/models/XGBoost/T47/',
'XGBoost/T48': '/content/drive/MyDrive/models/XGBoost/T48/',
'XGBoost/T49': '/content/drive/MyDrive/models/XGBoost/T49/',
'XGBoost/T50': '/content/drive/MyDrive/models/XGBoost/T50/',
'WeightedEnsemble_L2': '/content/drive/MyDrive/models/WeightedEnsemble_L2/'},
'model_fit_times': {'LightGBM/T1': 0.44619178771972656,
'LightGBM/T2': 0.3960275650024414,
'LightGBM/T3': 0.4548451900482178,
'LightGBM/T4': 0.38138365745544434,
'LightGBM/T5': 0.44043493270874023,
'LightGBM/T6': 0.4420430660247803,
'LightGBM/T7': 0.37445759773254395,
'LightGBM/T8': 0.42372941970825195,
'LightGBM/T9': 0.39161252975463867,
'LightGBM/T10': 0.45387959480285645,
'LightGBM/T11': 0.44157862663269043,
'LightGBM/T12': 0.42453575134277344,
'LightGBM/T13': 0.5748114585876465,
'LightGBM/T14': 0.5997474193572998,
'LightGBM/T15': 0.5793874263763428,
'LightGBM/T16': 0.6093466281890869,
'LightGBM/T17': 0.5896284580230713,
'LightGBM/T18': 0.6041669845581055,
'LightGBM/T19': 0.6147511005401611,
'LightGBM/T20': 0.6826622486114502,
'LightGBM/T21': 0.596259355545044,
'LightGBM/T22': 0.7305939197540283,
'LightGBM/T23': 0.6004137992858887,
'LightGBM/T24': 0.6291494369506836,
'LightGBM/T25': 0.6149673461914062,
'LightGBM/T26': 0.6543092727661133,
'LightGBM/T27': 0.658231258392334,
'LightGBM/T28': 0.6005210876464844,
'LightGBM/T29': 0.6320133209228516,
'LightGBM/T30': 0.7441155910491943,
'LightGBM/T31': 0.5986828804016113,
'LightGBM/T32': 0.5028660297393799,
'LightGBM/T33': 0.4333682060241699,
'LightGBM/T34': 0.40584802627563477,
'LightGBM/T35': 0.4318103790283203,
'LightGBM/T36': 0.4530465602874756,
'LightGBM/T37': 0.4471292495727539,
'LightGBM/T38': 0.4234964847564697,
'LightGBM/T39': 0.415693998336792,
'LightGBM/T40': 0.48626708984375,
'LightGBM/T41': 0.7241554260253906,
'LightGBM/T42': 0.550908088684082,
'LightGBM/T43': 0.656402587890625,
'LightGBM/T44': 0.645902156829834,
'LightGBM/T45': 0.5967357158660889,
'LightGBM/T46': 0.6715648174285889,
'LightGBM/T47': 0.6570718288421631,
'LightGBM/T48': 0.6309146881103516,
'LightGBM/T49': 0.8706381320953369,
'LightGBM/T50': 0.7580122947692871,
'RandomForest/T1': 2.807476043701172,
'RandomForest/T2': 4.1542909145355225,
'RandomForest/T3': 4.449815511703491,
'RandomForest/T4': 7.17970085144043,
'RandomForest/T5': 3.277503490447998,
'RandomForest/T6': 9.416213750839233,
'RandomForest/T7': 8.044477224349976,
'RandomForest/T8': 6.017071485519409,
'RandomForest/T9': 9.660467863082886,
'RandomForest/T10': 6.730472803115845,
'RandomForest/T11': 6.709576606750488,
'RandomForest/T12': 7.799920320510864,
'RandomForest/T13': 3.204716682434082,
'RandomForest/T14': 2.023550271987915,
'RandomForest/T15': 1.5044207572937012,
'RandomForest/T16': 3.347264289855957,
'RandomForest/T17': 6.898207187652588,
'RandomForest/T18': 5.597193717956543,
'RandomForest/T19': 2.044909715652466,
'RandomForest/T20': 0.6767699718475342,
'RandomForest/T21': 5.084296941757202,
'RandomForest/T22': 11.347149133682251,
'RandomForest/T23': 1.026792287826538,
'RandomForest/T24': 1.0555939674377441,
'RandomForest/T25': 9.729552507400513,
'RandomForest/T26': 3.505837917327881,
'RandomForest/T27': 9.837586641311646,
'RandomForest/T28': 0.9866974353790283,
'RandomForest/T29': 4.054540634155273,
'RandomForest/T30': 2.800886869430542,
'RandomForest/T31': 6.326571226119995,
'RandomForest/T32': 4.2360999584198,
'RandomForest/T33': 5.5722222328186035,
'RandomForest/T34': 10.514862537384033,
'RandomForest/T35': 4.345322847366333,
'RandomForest/T36': 3.816972255706787,
'RandomForest/T37': 2.3489387035369873,
'RandomForest/T38': 1.429830551147461,
'RandomForest/T39': 1.1668508052825928,
'RandomForest/T40': 6.212814569473267,
'RandomForest/T41': 1.7220170497894287,
'RandomForest/T42': 3.712662935256958,
'RandomForest/T43': 9.46116018295288,
'RandomForest/T44': 8.929495573043823,
'RandomForest/T45': 4.032301902770996,
'RandomForest/T46': 2.7352752685546875,
'RandomForest/T47': 1.998971939086914,
'RandomForest/T48': 6.098988771438599,
'RandomForest/T49': 5.3328025341033936,
'RandomForest/T50': 5.921796560287476,
'XGBoost/T1': 3.2282214164733887,
'XGBoost/T2': 2.0232319831848145,
'XGBoost/T3': 5.786648511886597,
'XGBoost/T4': 3.866899013519287,
'XGBoost/T5': 3.324223279953003,
'XGBoost/T6': 0.9286799430847168,
'XGBoost/T7': 10.724384784698486,
'XGBoost/T8': 2.6137280464172363,
'XGBoost/T9': 2.3781492710113525,
'XGBoost/T10': 0.7415010929107666,
'XGBoost/T11': 6.370123624801636,
'XGBoost/T12': 0.7072644233703613,
'XGBoost/T13': 6.7236058712005615,
'XGBoost/T14': 5.307306289672852,
'XGBoost/T15': 0.9340119361877441,
'XGBoost/T16': 6.32532000541687,
'XGBoost/T17': 4.2812418937683105,
'XGBoost/T18': 1.9238414764404297,
'XGBoost/T19': 4.817700624465942,
'XGBoost/T20': 0.6933462619781494,
'XGBoost/T21': 4.2326555252075195,
'XGBoost/T22': 3.339127779006958,
'XGBoost/T23': 3.6905977725982666,
'XGBoost/T24': 4.212552070617676,
'XGBoost/T25': 3.5980958938598633,
'XGBoost/T26': 1.2395226955413818,
'XGBoost/T27': 6.059900999069214,
'XGBoost/T28': 12.65174651145935,
'XGBoost/T29': 8.11357855796814,
'XGBoost/T30': 3.919740915298462,
'XGBoost/T31': 3.4687509536743164,
'XGBoost/T32': 1.7643587589263916,
'XGBoost/T33': 4.632750511169434,
'XGBoost/T34': 5.284574747085571,
'XGBoost/T35': 0.7138292789459229,
'XGBoost/T36': 1.5870366096496582,
'XGBoost/T37': 1.6433651447296143,
'XGBoost/T38': 1.7734801769256592,
'XGBoost/T39': 11.335424423217773,
'XGBoost/T40': 1.685659408569336,
'XGBoost/T41': 6.689138650894165,
'XGBoost/T42': 3.804924964904785,
'XGBoost/T43': 4.002641916275024,
'XGBoost/T44': 6.222007989883423,
'XGBoost/T45': 1.9108200073242188,
'XGBoost/T46': 4.22703218460083,
'XGBoost/T47': 8.022724866867065,
'XGBoost/T48': 4.5931901931762695,
'XGBoost/T49': 3.566131830215454,
'XGBoost/T50': 2.475245237350464,
'WeightedEnsemble_L2': 0.5395207405090332},
'model_pred_times': {'LightGBM/T1': 0.0234682559967041,
'LightGBM/T2': 0.02375483512878418,
'LightGBM/T3': 0.02186894416809082,
'LightGBM/T4': 0.0193939208984375,
'LightGBM/T5': 0.02228856086730957,
'LightGBM/T6': 0.014322996139526367,
'LightGBM/T7': 0.019127368927001953,
'LightGBM/T8': 0.02080821990966797,
'LightGBM/T9': 0.022289514541625977,
'LightGBM/T10': 0.023537635803222656,
'LightGBM/T11': 0.02469611167907715,
'LightGBM/T12': 0.018169164657592773,
'LightGBM/T13': 0.026221275329589844,
'LightGBM/T14': 0.019885540008544922,
'LightGBM/T15': 0.02270817756652832,
'LightGBM/T16': 0.02164459228515625,
'LightGBM/T17': 0.023262500762939453,
'LightGBM/T18': 0.036449432373046875,
'LightGBM/T19': 0.02135944366455078,
'LightGBM/T20': 0.022690534591674805,
'LightGBM/T21': 0.024231672286987305,
'LightGBM/T22': 0.02351212501525879,
'LightGBM/T23': 0.02706122398376465,
'LightGBM/T24': 0.022077083587646484,
'LightGBM/T25': 0.023218393325805664,
'LightGBM/T26': 0.037447214126586914,
'LightGBM/T27': 0.029222488403320312,
'LightGBM/T28': 0.027306556701660156,
'LightGBM/T29': 0.01905965805053711,
'LightGBM/T30': 0.023558378219604492,
'LightGBM/T31': 0.01999831199645996,
'LightGBM/T32': 0.017093420028686523,
'LightGBM/T33': 0.02092123031616211,
'LightGBM/T34': 0.01641225814819336,
'LightGBM/T35': 0.021066665649414062,
'LightGBM/T36': 0.02031683921813965,
'LightGBM/T37': 0.02385115623474121,
'LightGBM/T38': 0.02126002311706543,
'LightGBM/T39': 0.017766475677490234,
'LightGBM/T40': 0.025960922241210938,
'LightGBM/T41': 0.023407697677612305,
'LightGBM/T42': 0.021120548248291016,
'LightGBM/T43': 0.024018287658691406,
'LightGBM/T44': 0.02111077308654785,
'LightGBM/T45': 0.019781827926635742,
'LightGBM/T46': 0.029800891876220703,
'LightGBM/T47': 0.02619791030883789,
'LightGBM/T48': 0.020375490188598633,
'LightGBM/T49': 0.03079533576965332,
'LightGBM/T50': 0.02525782585144043,
'RandomForest/T1': 0.4403500556945801,
'RandomForest/T2': 0.4383115768432617,
'RandomForest/T3': 0.6104488372802734,
'RandomForest/T4': 0.9070084095001221,
'RandomForest/T5': 0.5189633369445801,
'RandomForest/T6': 0.9457070827484131,
'RandomForest/T7': 0.9131748676300049,
'RandomForest/T8': 1.2235233783721924,
'RandomForest/T9': 1.186030626296997,
'RandomForest/T10': 0.9738218784332275,
'RandomForest/T11': 1.1139793395996094,
'RandomForest/T12': 0.8084595203399658,
'RandomForest/T13': 0.6579334735870361,
'RandomForest/T14': 0.3136730194091797,
'RandomForest/T15': 0.25371646881103516,
'RandomForest/T16': 0.46750831604003906,
'RandomForest/T17': 0.9967317581176758,
'RandomForest/T18': 0.5409502983093262,
'RandomForest/T19': 0.3212299346923828,
'RandomForest/T20': 0.1332077980041504,
'RandomForest/T21': 1.1020987033843994,
'RandomForest/T22': 1.0382347106933594,
'RandomForest/T23': 0.1674363613128662,
'RandomForest/T24': 0.16875028610229492,
'RandomForest/T25': 1.455045223236084,
'RandomForest/T26': 0.4499049186706543,
'RandomForest/T27': 0.983769416809082,
'RandomForest/T28': 0.21500945091247559,
'RandomForest/T29': 1.0772531032562256,
'RandomForest/T30': 0.5103616714477539,
'RandomForest/T31': 1.0433008670806885,
'RandomForest/T32': 0.40979766845703125,
'RandomForest/T33': 0.9661316871643066,
'RandomForest/T34': 1.177184820175171,
'RandomForest/T35': 0.6026790142059326,
'RandomForest/T36': 0.6436810493469238,
'RandomForest/T37': 0.3409302234649658,
'RandomForest/T38': 0.19833111763000488,
'RandomForest/T39': 0.17534136772155762,
'RandomForest/T40': 0.6871397495269775,
'RandomForest/T41': 0.32127809524536133,
'RandomForest/T42': 0.40953922271728516,
'RandomForest/T43': 1.3772809505462646,
'RandomForest/T44': 1.2342073917388916,
'RandomForest/T45': 0.4446451663970947,
'RandomForest/T46': 0.47545313835144043,
'RandomForest/T47': 0.3685801029205322,
'RandomForest/T48': 0.7162630558013916,
'RandomForest/T49': 0.7370657920837402,
'RandomForest/T50': 0.6903126239776611,
'XGBoost/T1': 0.07568836212158203,
'XGBoost/T2': 0.12128973007202148,
'XGBoost/T3': 0.31340527534484863,
'XGBoost/T4': 0.06524109840393066,
'XGBoost/T5': 0.24658918380737305,
'XGBoost/T6': 0.04497075080871582,
'XGBoost/T7': 0.45288753509521484,
'XGBoost/T8': 0.15218877792358398,
'XGBoost/T9': 0.04367709159851074,
'XGBoost/T10': 0.021458864212036133,
'XGBoost/T11': 0.23934721946716309,
'XGBoost/T12': 0.02824854850769043,
'XGBoost/T13': 0.4187483787536621,
'XGBoost/T14': 0.6117968559265137,
'XGBoost/T15': 0.03880190849304199,
'XGBoost/T16': 0.11036157608032227,
'XGBoost/T17': 0.05531477928161621,
'XGBoost/T18': 0.05942082405090332,
'XGBoost/T19': 0.10197615623474121,
'XGBoost/T20': 0.05215787887573242,
'XGBoost/T21': 0.0702359676361084,
'XGBoost/T22': 0.033405303955078125,
'XGBoost/T23': 0.02597832679748535,
'XGBoost/T24': 0.05988359451293945,
'XGBoost/T25': 0.1872706413269043,
'XGBoost/T26': 0.06990504264831543,
'XGBoost/T27': 0.3032870292663574,
'XGBoost/T28': 0.6551320552825928,
'XGBoost/T29': 0.09547996520996094,
'XGBoost/T30': 0.0216367244720459,
'XGBoost/T31': 0.02985358238220215,
'XGBoost/T32': 0.060682058334350586,
'XGBoost/T33': 0.22440791130065918,
'XGBoost/T34': 0.4357118606567383,
'XGBoost/T35': 0.046584367752075195,
'XGBoost/T36': 0.14362025260925293,
'XGBoost/T37': 0.07735896110534668,
'XGBoost/T38': 0.021435260772705078,
'XGBoost/T39': 0.34052062034606934,
'XGBoost/T40': 0.08880209922790527,
'XGBoost/T41': 0.8909916877746582,
'XGBoost/T42': 0.20623397827148438,
'XGBoost/T43': 0.0720667839050293,
'XGBoost/T44': 0.2620220184326172,
'XGBoost/T45': 0.10295748710632324,
'XGBoost/T46': 0.10166645050048828,
'XGBoost/T47': 0.026499271392822266,
'XGBoost/T48': 0.17629027366638184,
'XGBoost/T49': 0.12980055809020996,
'XGBoost/T50': 0.18116307258605957,
'WeightedEnsemble_L2': 0.0006139278411865234},
'num_bag_folds': 0,
'max_stack_level': 2,
'model_hyperparams': {'LightGBM/T1': {'learning_rate': 0.1,
'num_boost_round': 100,
'num_leaves': 36,
'min_data_in_leaf': 20,
'feature_fraction': 1.0},
'LightGBM/T2': {'learning_rate': 0.0852084800119005,
'num_boost_round': 100,
'num_leaves': 29,
'min_data_in_leaf': 77,
'feature_fraction': 0.8872033759818312},
'LightGBM/T3': {'learning_rate': 0.06475526332006454,
'num_boost_round': 100,
'num_leaves': 49,
'min_data_in_leaf': 46,
'feature_fraction': 0.9618129346960314},
'LightGBM/T4': {'learning_rate': 0.17937179620011748,
'num_boost_round': 100,
'num_leaves': 27,
'min_data_in_leaf': 68,
'feature_fraction': 0.97294325019552},
'LightGBM/T5': {'learning_rate': 0.04876499175835442,
'num_boost_round': 100,
'num_leaves': 43,
'min_data_in_leaf': 98,
'feature_fraction': 0.9479312595206661},
'LightGBM/T6': {'learning_rate': 0.012371433997667478,
'num_boost_round': 100,
'num_leaves': 46,
'min_data_in_leaf': 19,
'feature_fraction': 0.9813991595731653},
'LightGBM/T7': {'learning_rate': 0.12113325685093149,
'num_boost_round': 100,
'num_leaves': 26,
'min_data_in_leaf': 89,
'feature_fraction': 0.7550545993600815},
'LightGBM/T8': {'learning_rate': 0.10957948710049178,
'num_boost_round': 100,
'num_leaves': 45,
'min_data_in_leaf': 39,
'feature_fraction': 0.994654585558191},
'LightGBM/T9': {'learning_rate': 0.0866093341267272,
'num_boost_round': 100,
'num_leaves': 35,
'min_data_in_leaf': 75,
'feature_fraction': 0.9197198825297401},
'LightGBM/T10': {'learning_rate': 0.09704690341525249,
'num_boost_round': 100,
'num_leaves': 49,
'min_data_in_leaf': 84,
'feature_fraction': 0.8843433073622526},
'LightGBM/T11': {'learning_rate': 0.10169538705876566,
'num_boost_round': 100,
'num_leaves': 54,
'min_data_in_leaf': 65,
'feature_fraction': 0.8161389030261568},
'LightGBM/T12': {'learning_rate': 0.010579035974693696,
'num_boost_round': 100,
'num_leaves': 31,
'min_data_in_leaf': 63,
'feature_fraction': 0.8921084872171621},
'LightGBM/T13': {'learning_rate': 0.06348185887563147,
'num_boost_round': 100,
'num_leaves': 30,
'min_data_in_leaf': 89,
'feature_fraction': 0.9030239306806054},
'LightGBM/T14': {'learning_rate': 0.02935827690025271,
'num_boost_round': 100,
'num_leaves': 27,
'min_data_in_leaf': 11,
'feature_fraction': 0.9204550747758709},
'LightGBM/T15': {'learning_rate': 0.18270510709257795,
'num_boost_round': 100,
'num_leaves': 44,
'min_data_in_leaf': 21,
'feature_fraction': 0.7748200875897435},
'LightGBM/T16': {'learning_rate': 0.02572656160972692,
'num_boost_round': 100,
'num_leaves': 38,
'min_data_in_leaf': 63,
'feature_fraction': 0.7822315744137134},
'LightGBM/T17': {'learning_rate': 0.03720777713148019,
'num_boost_round': 100,
'num_leaves': 32,
'min_data_in_leaf': 78,
'feature_fraction': 0.89254919260447},
'LightGBM/T18': {'learning_rate': 0.0186962680125594,
'num_boost_round': 100,
'num_leaves': 38,
'min_data_in_leaf': 13,
'feature_fraction': 0.775511202687007},
'LightGBM/T19': {'learning_rate': 0.021356986471473185,
'num_boost_round': 100,
'num_leaves': 46,
'min_data_in_leaf': 25,
'feature_fraction': 0.9132770813663496},
'LightGBM/T20': {'learning_rate': 0.016099937745347997,
'num_boost_round': 100,
'num_leaves': 47,
'min_data_in_leaf': 23,
'feature_fraction': 0.8111063980004007},
'LightGBM/T21': {'learning_rate': 0.10295580036641547,
'num_boost_round': 100,
'num_leaves': 26,
'min_data_in_leaf': 45,
'feature_fraction': 0.829300435517324},
'LightGBM/T22': {'learning_rate': 0.010414947978134175,
'num_boost_round': 100,
'num_leaves': 62,
'min_data_in_leaf': 60,
'feature_fraction': 0.9156317167375111},
'LightGBM/T23': {'learning_rate': 0.18638160768249964,
'num_boost_round': 100,
'num_leaves': 39,
'min_data_in_leaf': 52,
'feature_fraction': 0.7740246019734908},
'LightGBM/T24': {'learning_rate': 0.03863417751056239,
'num_boost_round': 100,
'num_leaves': 36,
'min_data_in_leaf': 10,
'feature_fraction': 0.7639286734254016},
'LightGBM/T25': {'learning_rate': 0.014334393803661175,
'num_boost_round': 100,
'num_leaves': 28,
'min_data_in_leaf': 33,
'feature_fraction': 0.8207017406441024},
'LightGBM/T26': {'learning_rate': 0.13980591376496246,
'num_boost_round': 100,
'num_leaves': 44,
'min_data_in_leaf': 77,
'feature_fraction': 0.9221652957014426},
'LightGBM/T27': {'learning_rate': 0.13351325426964994,
'num_boost_round': 100,
'num_leaves': 53,
'min_data_in_leaf': 60,
'feature_fraction': 0.8912972166512189},
'LightGBM/T28': {'learning_rate': 0.15792601676022927,
'num_boost_round': 100,
'num_leaves': 36,
'min_data_in_leaf': 46,
'feature_fraction': 0.9791807385048652},
'LightGBM/T29': {'learning_rate': 0.010284267246614846,
'num_boost_round': 100,
'num_leaves': 42,
'min_data_in_leaf': 12,
'feature_fraction': 0.8194296403202581},
'LightGBM/T30': {'learning_rate': 0.023797208069667456,
'num_boost_round': 100,
'num_leaves': 66,
'min_data_in_leaf': 29,
'feature_fraction': 0.9290818010296413},
'LightGBM/T31': {'learning_rate': 0.01550690498759594,
'num_boost_round': 100,
'num_leaves': 29,
'min_data_in_leaf': 12,
'feature_fraction': 0.842452023187085},
'LightGBM/T32': {'learning_rate': 0.023731025043582137,
'num_boost_round': 100,
'num_leaves': 34,
'min_data_in_leaf': 50,
'feature_fraction': 0.9259343198224791},
'LightGBM/T33': {'learning_rate': 0.05892021342087588,
'num_boost_round': 100,
'num_leaves': 29,
'min_data_in_leaf': 76,
'feature_fraction': 0.8940393336044592},
'LightGBM/T34': {'learning_rate': 0.038170120361146205,
'num_boost_round': 100,
'num_leaves': 29,
'min_data_in_leaf': 14,
'feature_fraction': 0.9881872528792462},
'LightGBM/T35': {'learning_rate': 0.024376670942419982,
'num_boost_round': 100,
'num_leaves': 37,
'min_data_in_leaf': 87,
'feature_fraction': 0.924869818829376},
'LightGBM/T36': {'learning_rate': 0.05704971024279554,
'num_boost_round': 100,
'num_leaves': 51,
'min_data_in_leaf': 31,
'feature_fraction': 0.9702757992777904},
'LightGBM/T37': {'learning_rate': 0.07215200288688035,
'num_boost_round': 100,
'num_leaves': 45,
'min_data_in_leaf': 71,
'feature_fraction': 0.8971162712671874},
'LightGBM/T38': {'learning_rate': 0.035599847041745736,
'num_boost_round': 100,
'num_leaves': 47,
'min_data_in_leaf': 80,
'feature_fraction': 0.9109975498074093},
'LightGBM/T39': {'learning_rate': 0.023696539842092294,
'num_boost_round': 100,
'num_leaves': 41,
'min_data_in_leaf': 34,
'feature_fraction': 0.9290186328071608},
'LightGBM/T40': {'learning_rate': 0.13896090994714988,
'num_boost_round': 100,
'num_leaves': 37,
'min_data_in_leaf': 89,
'feature_fraction': 0.9372924592881812},
'LightGBM/T41': {'learning_rate': 0.028881871222328978,
'num_boost_round': 100,
'num_leaves': 57,
'min_data_in_leaf': 11,
'feature_fraction': 0.7730934723652245},
'LightGBM/T42': {'learning_rate': 0.18224193636955568,
'num_boost_round': 100,
'num_leaves': 29,
'min_data_in_leaf': 34,
'feature_fraction': 0.758406273374708},
'LightGBM/T43': {'learning_rate': 0.015268225925104833,
'num_boost_round': 100,
'num_leaves': 64,
'min_data_in_leaf': 86,
'feature_fraction': 0.8053156712959457},
'LightGBM/T44': {'learning_rate': 0.021812796912286877,
'num_boost_round': 100,
'num_leaves': 49,
'min_data_in_leaf': 79,
'feature_fraction': 0.9960105603410366},
'LightGBM/T45': {'learning_rate': 0.013475421907461791,
'num_boost_round': 100,
'num_leaves': 36,
'min_data_in_leaf': 94,
'feature_fraction': 0.8619815429676825},
'LightGBM/T46': {'learning_rate': 0.124264918524181,
'num_boost_round': 100,
'num_leaves': 66,
'min_data_in_leaf': 38,
'feature_fraction': 0.8673122934975606},
'LightGBM/T47': {'learning_rate': 0.06322101400803244,
'num_boost_round': 100,
'num_leaves': 46,
'min_data_in_leaf': 33,
'feature_fraction': 0.7906232336690937},
'LightGBM/T48': {'learning_rate': 0.01520927569419526,
'num_boost_round': 100,
'num_leaves': 35,
'min_data_in_leaf': 49,
'feature_fraction': 0.8528491809388636},
'LightGBM/T49': {'learning_rate': 0.03059584582943981,
'num_boost_round': 100,
'num_leaves': 52,
'min_data_in_leaf': 56,
'feature_fraction': 0.9956437399562984},
'LightGBM/T50': {'learning_rate': 0.029889689088521656,
'num_boost_round': 100,
'num_leaves': 36,
'min_data_in_leaf': 19,
'feature_fraction': 0.7626470287635593},
'RandomForest/T1': {'n_estimators': 200,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 10,
'min_samples_leaf': 1,
'max_features': 'sqrt'},
'RandomForest/T2': {'n_estimators': 292,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 17,
'min_samples_leaf': 6,
'max_features': 'log2'},
'RandomForest/T3': {'n_estimators': 459,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 8,
'min_samples_leaf': 4,
'max_features': 'log2'},
'RandomForest/T4': {'n_estimators': 854,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 14,
'min_samples_leaf': 6,
'max_features': 'log2'},
'RandomForest/T5': {'n_estimators': 572,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 9,
'min_samples_leaf': 7,
'max_features': 'log2'},
'RandomForest/T6': {'n_estimators': 586,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 13,
'min_samples_leaf': 2,
'max_features': 'sqrt'},
'RandomForest/T7': {'n_estimators': 949,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 12,
'min_samples_leaf': 9,
'max_features': 'log2'},
'RandomForest/T8': {'n_estimators': 877,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 10,
'min_samples_leaf': 9,
'max_features': 'log2'},
'RandomForest/T9': {'n_estimators': 855,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 9,
'min_samples_leaf': 1,
'max_features': 'log2'},
'RandomForest/T10': {'n_estimators': 950,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 10,
'min_samples_leaf': 1,
'max_features': 'sqrt'},
'RandomForest/T11': {'n_estimators': 855,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 8,
'min_samples_leaf': 2,
'max_features': 'sqrt'},
'RandomForest/T12': {'n_estimators': 523,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 18,
'min_samples_leaf': 4,
'max_features': 'log2'},
'RandomForest/T13': {'n_estimators': 797,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 5,
'min_samples_leaf': 10,
'max_features': 'log2'},
'RandomForest/T14': {'n_estimators': 251,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 20,
'min_samples_leaf': 5,
'max_features': 'sqrt'},
'RandomForest/T15': {'n_estimators': 283,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 8,
'min_samples_leaf': 3,
'max_features': 'sqrt'},
'RandomForest/T16': {'n_estimators': 228,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 17,
'min_samples_leaf': 1,
'max_features': 'sqrt'},
'RandomForest/T17': {'n_estimators': 650,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 9,
'min_samples_leaf': 6,
'max_features': 'log2'},
'RandomForest/T18': {'n_estimators': 435,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 13,
'min_samples_leaf': 2,
'max_features': 'sqrt'},
'RandomForest/T19': {'n_estimators': 357,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 9,
'min_samples_leaf': 9,
'max_features': 'log2'},
'RandomForest/T20': {'n_estimators': 157,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 6,
'min_samples_leaf': 10,
'max_features': 'log2'},
'RandomForest/T21': {'n_estimators': 879,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 8,
'min_samples_leaf': 8,
'max_features': 'sqrt'},
'RandomForest/T22': {'n_estimators': 498,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 19,
'min_samples_leaf': 1,
'max_features': 'sqrt'},
'RandomForest/T23': {'n_estimators': 184,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 8,
'min_samples_leaf': 10,
'max_features': 'log2'},
'RandomForest/T24': {'n_estimators': 147,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 16,
'min_samples_leaf': 7,
'max_features': 'sqrt'},
'RandomForest/T25': {'n_estimators': 968,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 20,
'min_samples_leaf': 4,
'max_features': 'sqrt'},
'RandomForest/T26': {'n_estimators': 327,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 9,
'min_samples_leaf': 5,
'max_features': 'sqrt'},
'RandomForest/T27': {'n_estimators': 953,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 15,
'min_samples_leaf': 6,
'max_features': 'log2'},
'RandomForest/T28': {'n_estimators': 269,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 5,
'min_samples_leaf': 6,
'max_features': 'log2'},
'RandomForest/T29': {'n_estimators': 706,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 8,
'min_samples_leaf': 6,
'max_features': 'sqrt'},
'RandomForest/T30': {'n_estimators': 392,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 5,
'min_samples_leaf': 3,
'max_features': 'log2'},
'RandomForest/T31': {'n_estimators': 710,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 7,
'min_samples_leaf': 4,
'max_features': 'sqrt'},
'RandomForest/T32': {'n_estimators': 301,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 15,
'min_samples_leaf': 1,
'max_features': 'log2'},
'RandomForest/T33': {'n_estimators': 655,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 20,
'min_samples_leaf': 3,
'max_features': 'sqrt'},
'RandomForest/T34': {'n_estimators': 799,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 15,
'min_samples_leaf': 8,
'max_features': 'log2'},
'RandomForest/T35': {'n_estimators': 674,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 7,
'min_samples_leaf': 3,
'max_features': 'log2'},
'RandomForest/T36': {'n_estimators': 694,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 8,
'min_samples_leaf': 4,
'max_features': 'sqrt'},
'RandomForest/T37': {'n_estimators': 309,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 19,
'min_samples_leaf': 5,
'max_features': 'log2'},
'RandomForest/T38': {'n_estimators': 158,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 7,
'min_samples_leaf': 10,
'max_features': 'log2'},
'RandomForest/T39': {'n_estimators': 143,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 6,
'min_samples_leaf': 7,
'max_features': 'sqrt'},
'RandomForest/T40': {'n_estimators': 407,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 13,
'min_samples_leaf': 3,
'max_features': 'log2'},
'RandomForest/T41': {'n_estimators': 228,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 5,
'min_samples_leaf': 7,
'max_features': 'sqrt'},
'RandomForest/T42': {'n_estimators': 284,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 11,
'min_samples_leaf': 4,
'max_features': 'log2'},
'RandomForest/T43': {'n_estimators': 737,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 17,
'min_samples_leaf': 9,
'max_features': 'sqrt'},
'RandomForest/T44': {'n_estimators': 975,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 7,
'min_samples_leaf': 3,
'max_features': 'log2'},
'RandomForest/T45': {'n_estimators': 428,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 18,
'min_samples_leaf': 9,
'max_features': 'sqrt'},
'RandomForest/T46': {'n_estimators': 510,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 8,
'min_samples_leaf': 9,
'max_features': 'log2'},
'RandomForest/T47': {'n_estimators': 423,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 7,
'min_samples_leaf': 5,
'max_features': 'sqrt'},
'RandomForest/T48': {'n_estimators': 360,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 17,
'min_samples_leaf': 1,
'max_features': 'log2'},
'RandomForest/T49': {'n_estimators': 561,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 8,
'min_samples_leaf': 7,
'max_features': 'log2'},
'RandomForest/T50': {'n_estimators': 628,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 14,
'min_samples_leaf': 9,
'max_features': 'log2'},
'XGBoost/T1': {'n_estimators': 200,
'learning_rate': 0.1,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 10,
'min_child_weight': 1,
'subsample': 1,
'colsample_bytree': 1},
'XGBoost/T2': {'n_estimators': 459,
'learning_rate': 0.0852084800119005,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 8,
'min_child_weight': 8.72151055860481,
'subsample': 0.7118273996694524,
'colsample_bytree': 0.7744067519636624},
'XGBoost/T3': {'n_estimators': 496,
'learning_rate': 0.03709488999964027,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 11,
'min_child_weight': 1.5104167958569885,
'subsample': 0.6917207594128889,
'colsample_bytree': 0.8229470565333281},
'XGBoost/T4': {'n_estimators': 637,
'learning_rate': 0.04876499175835442,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 13,
'min_child_weight': 4.535063164907467,
'subsample': 0.5355180290989434,
'colsample_bytree': 0.8958625190413323},
'XGBoost/T5': {'n_estimators': 947,
'learning_rate': 0.010624408033089028,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 8,
'min_child_weight': 9.614396430577417,
'subsample': 0.9350060741234096,
'colsample_bytree': 0.5435646498507704},
'XGBoost/T6': {'n_estimators': 247,
'learning_rate': 0.10957948710049178,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 8,
'min_child_weight': 5.684297315960843,
'subsample': 0.5591372129344666,
'colsample_bytree': 0.989309171116382},
'XGBoost/T7': {'n_estimators': 814,
'learning_rate': 0.015364092985079616,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 20,
'min_child_weight': 7.827540618901215,
'subsample': 0.7073309699952618,
'colsample_bytree': 0.819960510663762},
'XGBoost/T8': {'n_estimators': 228,
'learning_rate': 0.10169538705876566,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 12,
'min_child_weight': 2.948953189819347,
'subsample': 0.5093949002181776,
'colsample_bytree': 0.6322778060523135},
'XGBoost/T9': {'n_estimators': 488,
'learning_rate': 0.06256837783419192,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 9,
'min_child_weight': 4.478400830132758,
'subsample': 0.8409101495517417,
'colsample_bytree': 0.8088177485379385},
'XGBoost/T10': {'n_estimators': 157,
'learning_rate': 0.037033237549985214,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 6,
'min_child_weight': 1.8935231532307648,
'subsample': 0.8333833577228338,
'colsample_bytree': 0.679753950286893},
'XGBoost/T11': {'n_estimators': 711,
'learning_rate': 0.01878079717563113,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 16,
'min_child_weight': 7.756175270966106,
'subsample': 0.6818553854713113,
'colsample_bytree': 0.8353189348090797},
'XGBoost/T12': {'n_estimators': 147,
'learning_rate': 0.03720777713148019,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 9,
'min_child_weight': 9.630543417620682,
'subsample': 0.6044383780474174,
'colsample_bytree': 0.7850983852089398},
'XGBoost/T13': {'n_estimators': 760,
'learning_rate': 0.0707478948788083,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 13,
'min_child_weight': 4.729317294037319,
'subsample': 0.6222127960008014,
'colsample_bytree': 0.5806547589424982},
'XGBoost/T14': {'n_estimators': 681,
'learning_rate': 0.013918786377459117,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 10,
'min_child_weight': 3.8548156786236647,
'subsample': 0.5982911808400267,
'colsample_bytree': 0.5794847918227599},
'XGBoost/T15': {'n_estimators': 392,
'learning_rate': 0.11698678831285687,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 5,
'min_child_weight': 6.605614859920025,
'subsample': 0.5480492039469815,
'colsample_bytree': 0.684362585330482},
'XGBoost/T16': {'n_estimators': 301,
'learning_rate': 0.04071264054824341,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 18,
'min_child_weight': 1.501432243314457,
'subsample': 0.5099938327043794,
'colsample_bytree': 0.9882297325066979},
'XGBoost/T17': {'n_estimators': 230,
'learning_rate': 0.18813591829048096,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 20,
'min_child_weight': 3.665261777699304,
'subsample': 0.8443305914028851,
'colsample_bytree': 0.7208554606244226},
'XGBoost/T18': {'n_estimators': 839,
'learning_rate': 0.15654957207501224,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 8,
'min_child_weight': 7.232249074330179,
'subsample': 0.9325512806526925,
'colsample_bytree': 0.9402379446262977},
'XGBoost/T19': {'n_estimators': 878,
'learning_rate': 0.15584183773279003,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 15,
'min_child_weight': 6.1835184600056134,
'subsample': 0.6388592806405162,
'colsample_bytree': 0.7544844803335072},
'XGBoost/T20': {'n_estimators': 228,
'learning_rate': 0.12471316823476494,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 5,
'min_child_weight': 7.44694483706709,
'subsample': 0.6323650821417248,
'colsample_bytree': 0.5046783524282663},
'XGBoost/T21': {'n_estimators': 386,
'learning_rate': 0.052388664511416556,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 17,
'min_child_weight': 8.460460262956268,
'subsample': 0.5732208813645561,
'colsample_bytree': 0.6989103763793145},
'XGBoost/T22': {'n_estimators': 396,
'learning_rate': 0.08233477621344185,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 16,
'min_child_weight': 9.659696906056944,
'subsample': 0.8780533469325205,
'colsample_bytree': 0.7848092029616534},
'XGBoost/T23': {'n_estimators': 929,
'learning_rate': 0.14647814095091793,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 13,
'min_child_weight': 3.0077346937655647,
'subsample': 0.7235626893088136,
'colsample_bytree': 0.6980491377116811},
'XGBoost/T24': {'n_estimators': 989,
'learning_rate': 0.08129119700295492,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 17,
'min_child_weight': 7.903213332028541,
'subsample': 0.6982528704234923,
'colsample_bytree': 0.923204336235564},
'XGBoost/T25': {'n_estimators': 564,
'learning_rate': 0.05704971024279554,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 10,
'min_child_weight': 9.688745984679343,
'subsample': 0.8626271399098202,
'colsample_bytree': 0.9405515985555808},
'XGBoost/T26': {'n_estimators': 328,
'learning_rate': 0.17534504067501946,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 6,
'min_child_weight': 4.5538236410520625,
'subsample': 0.8031966070639622,
'colsample_bytree': 0.7506621909633511},
'XGBoost/T27': {'n_estimators': 252,
'learning_rate': 0.02468072326194478,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 18,
'min_child_weight': 3.5919190392695244,
'subsample': 0.8090077144994208,
'colsample_bytree': 0.5095965991546667},
'XGBoost/T28': {'n_estimators': 449,
'learning_rate': 0.015005684851667521,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 16,
'min_child_weight': 1.8313650051480836,
'subsample': 0.7954363806240866,
'colsample_bytree': 0.7143843504728831},
'XGBoost/T29': {'n_estimators': 871,
'learning_rate': 0.0707675010184323,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 18,
'min_child_weight': 3.8889751723614774,
'subsample': 0.6837809350239483,
'colsample_bytree': 0.7871626244247893},
'XGBoost/T30': {'n_estimators': 655,
'learning_rate': 0.14468351085719264,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 18,
'min_child_weight': 5.833202692382017,
'subsample': 0.5501134436561506,
'colsample_bytree': 0.7179324626328134},
'XGBoost/T31': {'n_estimators': 768,
'learning_rate': 0.08496681811516953,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 16,
'min_child_weight': 8.570261168070395,
'subsample': 0.9340630286841072,
'colsample_bytree': 0.9597413068723368},
'XGBoost/T32': {'n_estimators': 274,
'learning_rate': 0.06322101400803244,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 12,
'min_child_weight': 8.01145917648132,
'subsample': 0.9036594793625053,
'colsample_bytree': 0.5812464673381874},
'XGBoost/T33': {'n_estimators': 999,
'learning_rate': 0.03386551492392295,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 14,
'min_child_weight': 9.843174638426742,
'subsample': 0.7100376848953054,
'colsample_bytree': 0.7845503693072966},
'XGBoost/T34': {'n_estimators': 529,
'learning_rate': 0.029889689088521656,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 14,
'min_child_weight': 8.7022300815335,
'subsample': 0.8824558494984286,
'colsample_bytree': 0.5252940575271186},
'XGBoost/T35': {'n_estimators': 133,
'learning_rate': 0.09457394806295376,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 7,
'min_child_weight': 5.689329455837163,
'subsample': 0.6694925583898892,
'colsample_bytree': 0.9720617597177272},
'XGBoost/T36': {'n_estimators': 354,
'learning_rate': 0.016690090582560147,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 7,
'min_child_weight': 3.015322192543421,
'subsample': 0.9640406467327954,
'colsample_bytree': 0.5897451304589278},
'XGBoost/T37': {'n_estimators': 447,
'learning_rate': 0.011000778045943141,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 5,
'min_child_weight': 5.500236679563713,
'subsample': 0.7886142943020837,
'colsample_bytree': 0.8522072009617664},
'XGBoost/T38': {'n_estimators': 214,
'learning_rate': 0.16422543982875826,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 12,
'min_child_weight': 7.111815540646726,
'subsample': 0.7949549881772855,
'colsample_bytree': 0.6189464106872543},
'XGBoost/T39': {'n_estimators': 614,
'learning_rate': 0.02545949565249501,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 18,
'min_child_weight': 2.9115144900127774,
'subsample': 0.593096502940168,
'colsample_bytree': 0.8650610147583848},
'XGBoost/T40': {'n_estimators': 441,
'learning_rate': 0.09165955990241643,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 7,
'min_child_weight': 9.732130147864906,
'subsample': 0.6271782408851965,
'colsample_bytree': 0.9721861949919668},
'XGBoost/T41': {'n_estimators': 960,
'learning_rate': 0.036744222350501066,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 11,
'min_child_weight': 8.929041059815074,
'subsample': 0.6888759196462404,
'colsample_bytree': 0.5290145801619378},
'XGBoost/T42': {'n_estimators': 598,
'learning_rate': 0.01076732360998369,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 10,
'min_child_weight': 7.69451090778809,
'subsample': 0.7268484222780227,
'colsample_bytree': 0.5898018387798174},
'XGBoost/T43': {'n_estimators': 675,
'learning_rate': 0.14675612923216924,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 14,
'min_child_weight': 5.225474647820539,
'subsample': 0.9532090435300855,
'colsample_bytree': 0.7682896055543611},
'XGBoost/T44': {'n_estimators': 236,
'learning_rate': 0.019871068246332103,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 20,
'min_child_weight': 3.8801543574022106,
'subsample': 0.8228922992509495,
'colsample_bytree': 0.5686102100485972},
'XGBoost/T45': {'n_estimators': 555,
'learning_rate': 0.04744165730141579,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 5,
'min_child_weight': 8.853855899026557,
'subsample': 0.712725768976088,
'colsample_bytree': 0.6623414860332326},
'XGBoost/T46': {'n_estimators': 832,
'learning_rate': 0.0766555433261059,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 12,
'min_child_weight': 7.187394487490337,
'subsample': 0.8943697137660183,
'colsample_bytree': 0.9426688298047927},
'XGBoost/T47': {'n_estimators': 424,
'learning_rate': 0.13973126370783961,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 17,
'min_child_weight': 2.919807796307338,
'subsample': 0.7358757854464006,
'colsample_bytree': 0.6147209172355227},
'XGBoost/T48': {'n_estimators': 356,
'learning_rate': 0.015847505128111863,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 18,
'min_child_weight': 4.36752982300803,
'subsample': 0.6074403683752891,
'colsample_bytree': 0.8557919085304069},
'XGBoost/T49': {'n_estimators': 382,
'learning_rate': 0.11237925629585807,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 12,
'min_child_weight': 2.057786703658298,
'subsample': 0.6384468754085455,
'colsample_bytree': 0.5932291097446692},
'XGBoost/T50': {'n_estimators': 473,
'learning_rate': 0.08251675509257471,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 8,
'min_child_weight': 6.088791806726581,
'subsample': 0.6024328810902229,
'colsample_bytree': 0.5874544371547538},
'WeightedEnsemble_L2': {'use_orig_features': False,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True}},
'leaderboard':
                   model    score_val  pred_time_val   fit_time  pred_time_val_marginal  fit_time_marginal  stack_level  can_infer  fit_order
0    WeightedEnsemble_L2   -34.824982       1.056576  30.332862                0.000614           0.539521            2       True        151
1             XGBoost/T7   -35.370728       0.452888  10.724385                0.452888          10.724385            1       True        107
2            XGBoost/T48   -35.381646       0.176290   4.593190                0.176290           4.593190            1       True        148
3            XGBoost/T11   -35.451044       0.239347   6.370124                0.239347           6.370124            1       True        111
4            XGBoost/T33   -35.709377       0.224408   4.632751                0.224408           4.632751            1       True        133
..                   ...          ...            ...        ...                     ...                ...          ...        ...        ...
146     RandomForest/T39  -100.382603       0.175341   1.166851                0.175341           1.166851            1       True         89
147     RandomForest/T28  -107.393671       0.215009   0.986697                0.215009           0.986697            1       True         78
148     RandomForest/T13  -107.426838       0.657933   3.204717                0.657933           3.204717            1       True         63
149     RandomForest/T30  -107.446557       0.510362   2.800887                0.510362           2.800887            1       True         80
150     RandomForest/T41  -107.462009       0.321278   1.722017                0.321278           1.722017            1       True         91
[151 rows x 9 columns]}
predictor_new_hpo.leaderboard(silent=True).plot(kind="bar", x="model", y="score_val")
<Axes: xlabel='model'>
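With 151 models, the bar chart above is crowded; a minimal sketch of plotting only the best few, using a hypothetical miniature of the leaderboard (scores copied from the output above; the real frame comes from `predictor_new_hpo.leaderboard(silent=True)`):

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # off-screen backend so the sketch runs headless
import matplotlib.pyplot as plt

# Hypothetical miniature of the leaderboard shown above.
lb = pd.DataFrame({
    "model": ["WeightedEnsemble_L2", "XGBoost/T7", "XGBoost/T48",
              "XGBoost/T11", "RandomForest/T41"],
    "score_val": [-34.824982, -35.370728, -35.381646,
                  -35.451044, -107.462009],
})

# score_val is the negated RMSE, so "largest" means best.
top = lb.nlargest(3, "score_val")
ax = top.plot(kind="bar", x="model", y="score_val", legend=False)
ax.set_ylabel("validation score (-RMSE)")
plt.tight_layout()
plt.savefig("leaderboard_top.png")
```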
predictions = predictor_new_hpo.predict(test)
predictions[predictions<0].sum()
-13.688927
predictions[predictions<0]=0
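An equivalent, more idiomatic way to zero out the negative predictions (Kaggle rejects negative counts) is pandas' `clip`; a minimal sketch with a hypothetical stand-in Series (in the notebook, `predictions` is what `predictor_new_hpo.predict(test)` returns):

```python
import pandas as pd

# Hypothetical stand-in for the predictor output.
preds = pd.Series([12.5, -3.2, 47.0, -0.1])

# clip(lower=0) zeroes every negative value in one vectorized call,
# matching the effect of predictions[predictions < 0] = 0 above.
preds = preds.clip(lower=0)
print(preds.tolist())  # → [12.5, 0.0, 47.0, 0.0]
```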
# Submit predictions the same way as before
submission_new_hpo = submission.copy()
submission_new_hpo["count"] = predictions
submission_new_hpo.to_csv("submission_new_hpo.csv", index=False)
!kaggle competitions submit -c bike-sharing-demand -f submission_new_hpo.csv -m "new features with hyperparameters"
100% 188k/188k [00:02<00:00, 79.8kB/s]
Successfully submitted to Bike Sharing Demand
!kaggle competitions submissions -c bike-sharing-demand | tail -n +1 | head -n 6
fileName                     date                 description                        status    publicScore  privateScore
---------------------------  -------------------  ---------------------------------  --------  -----------  ------------
submission_new_hpo.csv       2023-05-26 16:15:07  new features with hyperparameters  complete  0.46765      0.46765
submission_new_features.csv  2023-05-26 16:04:20  new features                       complete  0.65044      0.65044
submission.csv               2023-05-26 15:53:07  first raw submission               complete  1.80414      1.80414
submission_new_features.csv  2023-05-26 15:25:32  new features                       complete  0.71055      0.71055
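For the writeup, the public scores from the listing above can be charted side by side; a minimal sketch (score values copied from the submissions table, the short submission labels are my own shorthand):

```python
import pandas as pd
import matplotlib
matplotlib.use("Agg")  # off-screen backend so the sketch runs headless
import matplotlib.pyplot as plt

# Public leaderboard scores from the three modelling stages above
# (lower RMSLE is better in this competition).
scores = pd.DataFrame({
    "submission": ["initial", "new_features", "new_hpo"],
    "publicScore": [1.80414, 0.65044, 0.46765],
})

ax = scores.plot(kind="bar", x="submission", y="publicScore", legend=False)
ax.set_ylabel("Kaggle public score (RMSLE)")
plt.tight_layout()
plt.savefig("kaggle_scores.png")
```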
The search spaces below are passed to TabularPredictor.fit() through the hyperparameters and hyperparameter_tune_kwargs arguments.
# Define hyperparameters for different models
hyperparameters = {
'GBM': {'num_boost_round': ag.space.Int(lower=50, upper=500, default=100),
'num_leaves': ag.space.Int(lower=10, upper=100, default=36)},
'NN': {'num_epochs': ag.space.Int(lower=5, upper=20, default=10),
'learning_rate': ag.space.Real(1e-4, 1e-2, default=5e-4, log=True),
'activation': ag.space.Categorical('relu', 'softrelu', 'tanh'),
'layers': ag.space.Categorical([50], [100], [200, 100], [300, 200, 100]),
'dropout_prob': ag.space.Real(0.0, 0.5, default=0.1)},
'RF': {'n_estimators': ag.space.Int(lower=100, upper=1000, default=200),
'max_depth': ag.space.Int(lower=5, upper=20, default=10),
'min_samples_leaf': ag.space.Int(lower=1, upper=10, default=1)},
'XGB': {'n_estimators': ag.space.Int(lower=100, upper=1000, default=200),
'max_depth': ag.space.Int(lower=5, upper=20, default=10),
'learning_rate': ag.space.Real(1e-4, 1e-2, default=5e-4, log=True)}
}
# Define hyperparameter tuning settings
hyperparameter_tune_kwargs = {
'scheduler': 'local',
'searcher': 'random',
'num_trials': 50,
'time_limits': 600,
}
# Fit the model with hyperparameter tuning
predictor_new_hpo1 = TabularPredictor(
    label="count",
    eval_metric="root_mean_squared_error",
    path="/content/drive/MyDrive",
    learner_kwargs={"ignored_columns": ["casual", "registered"]},
)
predictor_new_hpo1.fit(
    train_data=train,
    hyperparameters=hyperparameters,
    hyperparameter_tune_kwargs=hyperparameter_tune_kwargs,
    ag_args_fit={"num_gpus": 1},
)
Warning: path already exists! This predictor may overwrite an existing predictor! path="/content/drive/MyDrive"
Warning: hyperparameter tuning is currently experimental and may cause the process to hang.
Beginning AutoGluon training ...
AutoGluon will save models to "/content/drive/MyDrive/"
AutoGluon Version: 0.7.0
Python Version: 3.10.11
Operating System: Linux
Platform Machine: x86_64
Platform Version: #1 SMP Sat Apr 29 09:15:28 UTC 2023
Train Data Rows: 10886
Train Data Columns: 14
Label Column: count
Preprocessing data ...
AutoGluon infers your prediction problem is: 'regression' (because dtype of label-column == int and many unique label-values observed).
Label info (max, min, mean, stddev): (977, 1, 191.57413, 181.14445)
If 'regression' is not the correct problem_type, please manually specify the problem_type parameter during predictor init (You may specify problem_type as one of: ['binary', 'multiclass', 'regression'])
Using Feature Generators to preprocess the data ...
Dropping user-specified ignored columns: ['casual', 'registered']
Fitting AutoMLPipelineFeatureGenerator...
Available Memory: 9766.88 MB
Train Data (Original) Memory Usage: 0.89 MB (0.0% of available memory)
Inferring data type of each feature based on column values. Set feature_metadata_in to manually specify special dtypes of the features.
Stage 1 Generators:
Fitting AsTypeFeatureGenerator...
Note: Converting 2 features to boolean dtype as they only contain 2 unique values.
Stage 2 Generators:
Fitting FillNaFeatureGenerator...
Stage 3 Generators:
Fitting IdentityFeatureGenerator...
Fitting CategoryFeatureGenerator...
Fitting CategoryMemoryMinimizeFeatureGenerator...
Fitting DatetimeFeatureGenerator...
Stage 4 Generators:
Fitting DropUniqueFeatureGenerator...
Types of features in original data (raw dtype, special dtypes):
('category', []) : 2 | ['season', 'weather']
('datetime', []) : 1 | ['datetime']
('float', []) : 4 | ['temp', 'atemp', 'windspeed', 'temp_humidity']
('int', []) : 5 | ['holiday', 'workingday', 'humidity', 'hour', 'hour_squared']
Types of features in processed data (raw dtype, special dtypes):
('category', []) : 2 | ['season', 'weather']
('float', []) : 4 | ['temp', 'atemp', 'windspeed', 'temp_humidity']
('int', []) : 3 | ['humidity', 'hour', 'hour_squared']
('int', ['bool']) : 2 | ['holiday', 'workingday']
('int', ['datetime_as_int']) : 5 | ['datetime', 'datetime.year', 'datetime.month', 'datetime.day', 'datetime.dayofweek']
0.5s = Fit runtime
12 features in original data used to generate 16 features in processed data.
Train Data (Processed) Memory Usage: 1.09 MB (0.0% of available memory)
Data preprocessing and feature engineering runtime = 0.59s ...
AutoGluon will gauge predictive performance using evaluation metric: 'root_mean_squared_error'
This metric's sign has been flipped to adhere to being higher_is_better. The metric score can be multiplied by -1 to get the metric value.
To change this, specify the eval_metric parameter of Predictor()
Automatically generating train/validation split with holdout_frac=0.2, Train Rows: 8708, Val Rows: 2178
WARNING: "NN" model has been deprecated in v0.4.0 and renamed to "NN_MXNET". Starting in v0.6.0, specifying "NN" or "NN_MXNET" will raise an exception. Consider instead specifying "NN_TORCH".
Fitting 4 L1 models ...
Hyperparameter tuning model: LightGBM ...
Training LightGBM/T1 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T2 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T3 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T4 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T5 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. 
Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T6 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T7 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T8 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T9 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. 
Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T10 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T11 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T12 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T13 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. 
Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T14 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T15 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T16 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T17 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. 
Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T18 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T19 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T20 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T21 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. 
Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T22 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T23 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T24 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T25 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. 
Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T26 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T27 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T28 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T29 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. 
Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T30 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T31 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T32 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T33 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. 
Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T34 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T35 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T36 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. Falling back to CPU training...Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-versionOne possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu Training LightGBM/T37 with GPU, note that this may negatively impact model quality compared to CPU training. Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception. 
Falling back to CPU training...
Refer to LightGBM GPU documentation: https://github.com/Microsoft/LightGBM/tree/master/python-package#build-gpu-version
One possible method is: pip uninstall lightgbm -y pip install lightgbm --install-option=--gpu
Training LightGBM/T38 with GPU, note that this may negatively impact model quality compared to CPU training.
Warning: GPU mode might not be installed for LightGBM, GPU training raised an exception.
Fitted model: LightGBM/T1 ... -40.6405 = Validation score (-root_mean_squared_error), 1.53s = Training runtime, 0.02s = Validation runtime
Fitted model: LightGBM/T2 ... -35.9319 = Validation score (-root_mean_squared_error), 2.28s = Training runtime, 0.08s = Validation runtime
Fitted model: LightGBM/T3 ... -36.4713 = Validation score (-root_mean_squared_error), 2.18s = Training runtime, 0.15s = Validation runtime
Fitted model: LightGBM/T4 ... -37.4446 = Validation score (-root_mean_squared_error), 1.23s = Training runtime, 0.04s = Validation runtime
Fitted model: LightGBM/T5 ... -42.5677 = Validation score (-root_mean_squared_error), 0.76s = Training runtime, 0.03s = Validation runtime
Fitted model: LightGBM/T6 ... -36.5794 = Validation score (-root_mean_squared_error), 1.53s = Training runtime, 0.09s = Validation runtime
Fitted model: LightGBM/T7 ... -36.1698 = Validation score (-root_mean_squared_error), 0.96s = Training runtime, 0.04s = Validation runtime
Fitted model: LightGBM/T8 ... -36.1908 = Validation score (-root_mean_squared_error), 1.78s = Training runtime, 0.14s = Validation runtime
Fitted model: LightGBM/T9 ... -102.7542 = Validation score (-root_mean_squared_error), 0.52s = Training runtime, 0.02s = Validation runtime
Fitted model: LightGBM/T10 ... -37.4891 = Validation score (-root_mean_squared_error), 1.32s = Training runtime, 0.09s = Validation runtime
Fitted model: LightGBM/T11 ... -116.3701 = Validation score (-root_mean_squared_error), 0.41s = Training runtime, 0.01s = Validation runtime
Fitted model: LightGBM/T12 ... -36.3458 = Validation score (-root_mean_squared_error), 1.09s = Training runtime, 0.09s = Validation runtime
Fitted model: LightGBM/T13 ... -36.346 = Validation score (-root_mean_squared_error), 1.42s = Training runtime, 0.1s = Validation runtime
Fitted model: LightGBM/T14 ... -48.9157 = Validation score (-root_mean_squared_error), 0.45s = Training runtime, 0.02s = Validation runtime
Fitted model: LightGBM/T15 ... -37.0762 = Validation score (-root_mean_squared_error), 1.8s = Training runtime, 0.19s = Validation runtime
Fitted model: LightGBM/T16 ... -40.9987 = Validation score (-root_mean_squared_error), 0.48s = Training runtime, 0.02s = Validation runtime
Fitted model: LightGBM/T17 ... -39.2815 = Validation score (-root_mean_squared_error), 0.61s = Training runtime, 0.04s = Validation runtime
Fitted model: LightGBM/T18 ... -39.5386 = Validation score (-root_mean_squared_error), 2.64s = Training runtime, 0.13s = Validation runtime
Fitted model: LightGBM/T19 ... -36.9345 = Validation score (-root_mean_squared_error), 2.11s = Training runtime, 0.1s = Validation runtime
Fitted model: LightGBM/T20 ... -36.6217 = Validation score (-root_mean_squared_error), 2.03s = Training runtime, 0.2s = Validation runtime
Fitted model: LightGBM/T21 ... -38.2632 = Validation score (-root_mean_squared_error), 1.61s = Training runtime, 0.1s = Validation runtime
Fitted model: LightGBM/T22 ... -116.568 = Validation score (-root_mean_squared_error), 0.83s = Training runtime, 0.02s = Validation runtime
Fitted model: LightGBM/T23 ... -40.7264 = Validation score (-root_mean_squared_error), 1.05s = Training runtime, 0.04s = Validation runtime
Fitted model: LightGBM/T24 ... -49.5492 = Validation score (-root_mean_squared_error), 1.86s = Training runtime, 0.11s = Validation runtime
Fitted model: LightGBM/T25 ... -41.4164 = Validation score (-root_mean_squared_error), 0.67s = Training runtime, 0.03s = Validation runtime
Fitted model: LightGBM/T26 ... -119.0068 = Validation score (-root_mean_squared_error), 0.35s = Training runtime, 0.01s = Validation runtime
Fitted model: LightGBM/T27 ... -37.7652 = Validation score (-root_mean_squared_error), 0.74s = Training runtime, 0.04s = Validation runtime
Fitted model: LightGBM/T28 ... -40.7542 = Validation score (-root_mean_squared_error), 0.5s = Training runtime, 0.03s = Validation runtime
Fitted model: LightGBM/T29 ... -39.4284 = Validation score (-root_mean_squared_error), 0.63s = Training runtime, 0.05s = Validation runtime
Fitted model: LightGBM/T30 ... -36.949 = Validation score (-root_mean_squared_error), 0.81s = Training runtime, 0.05s = Validation runtime
Fitted model: LightGBM/T31 ... -46.0416 = Validation score (-root_mean_squared_error), 0.84s = Training runtime, 0.06s = Validation runtime
Fitted model: LightGBM/T32 ... -36.8483 = Validation score (-root_mean_squared_error), 0.6s = Training runtime, 0.03s = Validation runtime
Fitted model: LightGBM/T33 ... -37.476 = Validation score (-root_mean_squared_error), 0.61s = Training runtime, 0.04s = Validation runtime
Fitted model: LightGBM/T34 ... -36.474 = Validation score (-root_mean_squared_error), 1.33s = Training runtime, 0.09s = Validation runtime
Fitted model: LightGBM/T35 ... -37.3988 = Validation score (-root_mean_squared_error), 1.12s = Training runtime, 0.08s = Validation runtime
Fitted model: LightGBM/T36 ... -36.8633 = Validation score (-root_mean_squared_error), 0.85s = Training runtime, 0.06s = Validation runtime
Fitted model: LightGBM/T37 ... -37.4136 = Validation score (-root_mean_squared_error), 1.21s = Training runtime, 0.05s = Validation runtime
Fitted model: LightGBM/T38 ... -36.4467 = Validation score (-root_mean_squared_error), 1.83s = Training runtime, 0.13s = Validation runtime
Fitted model: LightGBM/T39 ... -37.3866 = Validation score (-root_mean_squared_error), 1.83s = Training runtime, 0.1s = Validation runtime
Fitted model: LightGBM/T40 ... -37.0123 = Validation score (-root_mean_squared_error), 1.21s = Training runtime, 0.05s = Validation runtime
Fitted model: LightGBM/T41 ... -37.0042 = Validation score (-root_mean_squared_error), 1.02s = Training runtime, 0.06s = Validation runtime
Fitted model: LightGBM/T42 ... -36.7945 = Validation score (-root_mean_squared_error), 0.75s = Training runtime, 0.02s = Validation runtime
Fitted model: LightGBM/T43 ... -37.1194 = Validation score (-root_mean_squared_error), 1.12s = Training runtime, 0.06s = Validation runtime
Fitted model: LightGBM/T44 ... -47.0317 = Validation score (-root_mean_squared_error), 0.83s = Training runtime, 0.03s = Validation runtime
Fitted model: LightGBM/T45 ... -44.4822 = Validation score (-root_mean_squared_error), 1.16s = Training runtime, 0.08s = Validation runtime
Fitted model: LightGBM/T46 ... -37.0099 = Validation score (-root_mean_squared_error), 1.17s = Training runtime, 0.07s = Validation runtime
Fitted model: LightGBM/T47 ... -35.9249 = Validation score (-root_mean_squared_error), 1.59s = Training runtime, 0.04s = Validation runtime
Fitted model: LightGBM/T48 ... -42.1244 = Validation score (-root_mean_squared_error), 1.1s = Training runtime, 0.07s = Validation runtime
Fitted model: LightGBM/T49 ... -38.1451 = Validation score (-root_mean_squared_error), 0.8s = Training runtime, 0.06s = Validation runtime
Fitted model: LightGBM/T50 ... -36.717 = Validation score (-root_mean_squared_error), 1.08s = Training runtime, 0.07s = Validation runtime
Hyperparameter tuning model: RandomForest ...
Fitted model: RandomForest/T1 ... -46.1356 = Validation score (-root_mean_squared_error), 5.27s = Training runtime, 0.32s = Validation runtime
Fitted model: RandomForest/T2 ... -42.5083 = Validation score (-root_mean_squared_error), 12.64s = Training runtime, 0.31s = Validation runtime
Fitted model: RandomForest/T3 ... -59.1781 = Validation score (-root_mean_squared_error), 7.5s = Training runtime, 0.45s = Validation runtime
Fitted model: RandomForest/T4 ... -41.4804 = Validation score (-root_mean_squared_error), 15.64s = Training runtime, 0.6s = Validation runtime
Fitted model: RandomForest/T5 ... -71.458 = Validation score (-root_mean_squared_error), 10.3s = Training runtime, 0.93s = Validation runtime
Fitted model: RandomForest/T6 ... -45.4842 = Validation score (-root_mean_squared_error), 20.09s = Training runtime, 0.69s = Validation runtime
Fitted model: RandomForest/T7 ... -40.1194 = Validation score (-root_mean_squared_error), 28.2s = Training runtime, 0.75s = Validation runtime
Fitted model: RandomForest/T8 ... -44.3341 = Validation score (-root_mean_squared_error), 10.98s = Training runtime, 0.73s = Validation runtime
Fitted model: RandomForest/T9 ... -41.0278 = Validation score (-root_mean_squared_error), 28.14s = Training runtime, 0.93s = Validation runtime
Fitted model: RandomForest/T10 ... -44.3001 = Validation score (-root_mean_squared_error), 26.29s = Training runtime, 0.94s = Validation runtime
Fitted model: RandomForest/T11 ... -50.5938 = Validation score (-root_mean_squared_error), 22.91s = Training runtime, 0.85s = Validation runtime
Fitted model: RandomForest/T12 ... -45.8426 = Validation score (-root_mean_squared_error), 27.34s = Training runtime, 0.94s = Validation runtime
Fitted model: RandomForest/T13 ... -59.4256 = Validation score (-root_mean_squared_error), 5.08s = Training runtime, 0.38s = Validation runtime
Fitted model: RandomForest/T14 ... -59.3241 = Validation score (-root_mean_squared_error), 7.88s = Training runtime, 0.34s = Validation runtime
Fitted model: RandomForest/T15 ... -43.6234 = Validation score (-root_mean_squared_error), 10.69s = Training runtime, 0.41s = Validation runtime
Fitted model: RandomForest/T16 ... -89.4961 = Validation score (-root_mean_squared_error), 14.86s = Training runtime, 0.99s = Validation runtime
Fitted model: RandomForest/T17 ... -39.8374 = Validation score (-root_mean_squared_error), 30.82s = Training runtime, 1.01s = Validation runtime
Fitted model: RandomForest/T18 ... -42.0011 = Validation score (-root_mean_squared_error), 6.78s = Training runtime, 0.37s = Validation runtime
Fitted model: RandomForest/T19 ... -59.3108 = Validation score (-root_mean_squared_error), 10.03s = Training runtime, 0.39s = Validation runtime
Fitted model: RandomForest/T20 ... -40.7213 = Validation score (-root_mean_squared_error), 10.97s = Training runtime, 0.36s = Validation runtime
Fitted model: RandomForest/T21 ... -103.1131 = Validation score (-root_mean_squared_error), 3.26s = Training runtime, 0.25s = Validation runtime
Fitted model: RandomForest/T22 ... -46.4898 = Validation score (-root_mean_squared_error), 17.74s = Training runtime, 0.56s = Validation runtime
Fitted model: RandomForest/T23 ... -50.7585 = Validation score (-root_mean_squared_error), 10.1s = Training runtime, 0.64s = Validation runtime
Fitted model: RandomForest/T24 ... -51.6895 = Validation score (-root_mean_squared_error), 4.44s = Training runtime, 0.19s = Validation runtime
Fitted model: RandomForest/T25 ... -44.3153 = Validation score (-root_mean_squared_error), 9.84s = Training runtime, 0.36s = Validation runtime
Fitted model: RandomForest/T26 ... -89.8413 = Validation score (-root_mean_squared_error), 1.99s = Training runtime, 0.13s = Validation runtime
Fitted model: RandomForest/T27 ... -59.1899 = Validation score (-root_mean_squared_error), 4.42s = Training runtime, 0.33s = Validation runtime
Fitted model: RandomForest/T28 ... -40.8687 = Validation score (-root_mean_squared_error), 9.02s = Training runtime, 0.34s = Validation runtime
Fitted model: RandomForest/T29 ... -103.0716 = Validation score (-root_mean_squared_error), 8.47s = Training runtime, 0.55s = Validation runtime
Fitted model: RandomForest/T30 ... -45.199 = Validation score (-root_mean_squared_error), 3.97s = Training runtime, 0.23s = Validation runtime
Fitted model: RandomForest/T31 ... -41.731 = Validation score (-root_mean_squared_error), 29.86s = Training runtime, 1.4s = Validation runtime
Fitted model: RandomForest/T32 ... -50.8069 = Validation score (-root_mean_squared_error), 7.25s = Training runtime, 0.32s = Validation runtime
Fitted model: RandomForest/T33 ... -41.8808 = Validation score (-root_mean_squared_error), 6.88s = Training runtime, 0.31s = Validation runtime
Fitted model: RandomForest/T34 ... -42.3016 = Validation score (-root_mean_squared_error), 12.01s = Training runtime, 0.52s = Validation runtime
Fitted model: RandomForest/T35 ... -43.6195 = Validation score (-root_mean_squared_error), 23.81s = Training runtime, 1.26s = Validation runtime
Fitted model: RandomForest/T36 ... -42.3154 = Validation score (-root_mean_squared_error), 31.45s = Training runtime, 1.07s = Validation runtime
Fitted model: RandomForest/T37 ... -103.0633 = Validation score (-root_mean_squared_error), 8.02s = Training runtime, 1.09s = Validation runtime
Fitted model: RandomForest/T38 ... -41.4302 = Validation score (-root_mean_squared_error), 18.76s = Training runtime, 0.59s = Validation runtime
Fitted model: RandomForest/T39 ... -42.3592 = Validation score (-root_mean_squared_error), 26.81s = Training runtime, 0.77s = Validation runtime
Fitted model: RandomForest/T40 ... -103.0082 = Validation score (-root_mean_squared_error), 14.82s = Training runtime, 0.92s = Validation runtime
Fitted model: RandomForest/T41 ... -50.5195 = Validation score (-root_mean_squared_error), 21.97s = Training runtime, 1.26s = Validation runtime
Fitted model: RandomForest/T42 ... -41.1179 = Validation score (-root_mean_squared_error), 23.94s = Training runtime, 1.09s = Validation runtime
Fitted model: RandomForest/T43 ... -40.4416 = Validation score (-root_mean_squared_error), 9.91s = Training runtime, 0.41s = Validation runtime
Fitted model: RandomForest/T44 ... -39.8632 = Validation score (-root_mean_squared_error), 23.48s = Training runtime, 0.73s = Validation runtime
Fitted model: RandomForest/T45 ... -43.599 = Validation score (-root_mean_squared_error), 24.13s = Training runtime, 0.87s = Validation runtime
Fitted model: RandomForest/T46 ... -71.8269 = Validation score (-root_mean_squared_error), 5.26s = Training runtime, 0.35s = Validation runtime
Fitted model: RandomForest/T47 ... -41.0848 = Validation score (-root_mean_squared_error), 29.0s = Training runtime, 1.39s = Validation runtime
Fitted model: RandomForest/T48 ... -41.1881 = Validation score (-root_mean_squared_error), 23.23s = Training runtime, 1.1s = Validation runtime
Fitted model: RandomForest/T49 ... -41.3613 = Validation score (-root_mean_squared_error), 7.5s = Training runtime, 0.3s = Validation runtime
Fitted model: RandomForest/T50 ... -89.8729 = Validation score (-root_mean_squared_error), 7.95s = Training runtime, 0.66s = Validation runtime
Hyperparameter tuning model: XGBoost ...
Fitted model: XGBoost/T1 ... -242.2085 = Validation score (-root_mean_squared_error), 1.97s = Training runtime, 0.12s = Validation runtime
Fitted model: XGBoost/T2 ... -57.6242 = Validation score (-root_mean_squared_error), 4.51s = Training runtime, 0.46s = Validation runtime
Fitted model: XGBoost/T3 ... -99.0782 = Validation score (-root_mean_squared_error), 3.23s = Training runtime, 0.34s = Validation runtime
Fitted model: XGBoost/T4 ... -39.3426 = Validation score (-root_mean_squared_error), 20.52s = Training runtime, 0.43s = Validation runtime
Fitted model: XGBoost/T5 ... -139.0641 = Validation score (-root_mean_squared_error), 2.48s = Training runtime, 0.16s = Validation runtime
Fitted model: XGBoost/T6 ... -102.8627 = Validation score (-root_mean_squared_error), 2.15s = Training runtime, 0.16s = Validation runtime
Fitted model: XGBoost/T7 ... -82.8919 = Validation score (-root_mean_squared_error), 1.46s = Training runtime, 0.08s = Validation runtime
Fitted model: XGBoost/T8 ... -88.0046 = Validation score (-root_mean_squared_error), 10.59s = Training runtime, 0.2s = Validation runtime
Fitted model: XGBoost/T9 ... -128.0395 = Validation score (-root_mean_squared_error), 2.86s = Training runtime, 0.08s = Validation runtime
Fitted model: XGBoost/T10 ... -44.1355 = Validation score (-root_mean_squared_error), 11.45s = Training runtime, 1.0s = Validation runtime
Fitted model: XGBoost/T11 ... -192.5749 = Validation score (-root_mean_squared_error), 8.36s = Training runtime, 0.39s = Validation runtime
Fitted model: XGBoost/T12 ... -39.991 = Validation score (-root_mean_squared_error), 15.8s = Training runtime, 0.95s = Validation runtime
Fitted model: XGBoost/T13 ... -115.6039 = Validation score (-root_mean_squared_error), 6.98s = Training runtime, 0.18s = Validation runtime
Fitted model: XGBoost/T14 ... -219.4567 = Validation score (-root_mean_squared_error), 3.27s = Training runtime, 0.23s = Validation runtime
Fitted model: XGBoost/T15 ... -243.8436 = Validation score (-root_mean_squared_error), 2.06s = Training runtime, 0.07s = Validation runtime
Fitted model: XGBoost/T16 ... -59.8269 = Validation score (-root_mean_squared_error), 4.07s = Training runtime, 0.18s = Validation runtime
Fitted model: XGBoost/T17 ... -170.9189 = Validation score (-root_mean_squared_error), 4.62s = Training runtime, 0.18s = Validation runtime
Fitted model: XGBoost/T18 ... -223.1008 = Validation score (-root_mean_squared_error), 4.32s = Training runtime, 0.33s = Validation runtime
Fitted model: XGBoost/T19 ... -39.5308 = Validation score (-root_mean_squared_error), 62.65s = Training runtime, 2.56s = Validation runtime
Fitted model: XGBoost/T20 ... -239.9045 = Validation score (-root_mean_squared_error), 8.95s = Training runtime, 0.39s = Validation runtime
Fitted model: XGBoost/T21 ... -251.4065 = Validation score (-root_mean_squared_error), 4.02s = Training runtime, 0.2s = Validation runtime
Fitted model: XGBoost/T22 ... -247.5726 = Validation score (-root_mean_squared_error), 2.84s = Training runtime, 0.14s = Validation runtime
Fitted model: XGBoost/T23 ... -38.9507 = Validation score (-root_mean_squared_error), 17.22s = Training runtime, 0.54s = Validation runtime
Fitted model: XGBoost/T24 ... -40.5635 = Validation score (-root_mean_squared_error), 2.87s = Training runtime, 0.23s = Validation runtime
Fitted model: XGBoost/T25 ... -90.8502 = Validation score (-root_mean_squared_error), 17.27s = Training runtime, 1.19s = Validation runtime
Fitted model: XGBoost/T26 ... -232.6079 = Validation score (-root_mean_squared_error), 3.47s = Training runtime, 0.17s = Validation runtime
Fitted model: XGBoost/T27 ... -127.4919 = Validation score (-root_mean_squared_error), 8.51s = Training runtime, 0.53s = Validation runtime
Fitted model: XGBoost/T28 ... -257.4382 = Validation score (-root_mean_squared_error), 0.99s = Training runtime, 0.08s = Validation runtime
Fitted model: XGBoost/T29 ... -109.1925 = Validation score (-root_mean_squared_error), 2.54s = Training runtime, 0.12s = Validation runtime
Fitted model: XGBoost/T30 ... -47.4733 = Validation score (-root_mean_squared_error), 6.52s = Training runtime, 0.22s = Validation runtime
Fitted model: XGBoost/T31 ... -145.607 = Validation score (-root_mean_squared_error), 4.9s = Training runtime, 0.29s = Validation runtime
Fitted model: XGBoost/T32 ... -50.6531 = Validation score (-root_mean_squared_error), 15.0s = Training runtime, 0.57s = Validation runtime
Fitted model: XGBoost/T33 ... -216.1679 = Validation score (-root_mean_squared_error), 8.49s = Training runtime, 0.28s = Validation runtime
Fitted model: XGBoost/T34 ... -132.0429 = Validation score (-root_mean_squared_error), 12.43s = Training runtime, 0.72s = Validation runtime
Fitted model: XGBoost/T35 ... -140.042 = Validation score (-root_mean_squared_error), 6.63s = Training runtime, 0.19s = Validation runtime
Fitted model: XGBoost/T36 ... -84.1901 = Validation score (-root_mean_squared_error), 14.91s = Training runtime, 0.9s = Validation runtime
Fitted model: XGBoost/T37 ... -73.8061 = Validation score (-root_mean_squared_error), 3.94s = Training runtime, 0.13s = Validation runtime
Fitted model: XGBoost/T38 ... -171.9386 = Validation score (-root_mean_squared_error), 3.51s = Training runtime, 0.11s = Validation runtime
Fitted model: XGBoost/T39 ... -212.4023 = Validation score (-root_mean_squared_error), 4.03s = Training runtime, 0.24s = Validation runtime
Fitted model: XGBoost/T40 ... -100.5063 = Validation score (-root_mean_squared_error), 2.69s = Training runtime, 0.12s = Validation runtime
Fitted model: XGBoost/T41 ... -212.045 = Validation score (-root_mean_squared_error), 5.55s = Training runtime, 0.38s = Validation runtime
Fitted model: XGBoost/T42 ... -80.551 = Validation score (-root_mean_squared_error), 1.41s = Training runtime, 0.1s = Validation runtime
Fitted model: XGBoost/T43 ... -45.4968 = Validation score (-root_mean_squared_error), 3.82s = Training runtime, 0.21s = Validation runtime
Fitted model: XGBoost/T44 ... -39.1383 = Validation score (-root_mean_squared_error), 11.22s = Training runtime, 1.13s = Validation runtime
Fitted model: XGBoost/T45 ... -45.6811 = Validation score (-root_mean_squared_error), 14.71s = Training runtime, 0.35s = Validation runtime
Fitted model: XGBoost/T46 ... -174.9311 = Validation score (-root_mean_squared_error), 2.42s = Training runtime, 0.11s = Validation runtime
Fitted model: XGBoost/T47 ... -131.6406 = Validation score (-root_mean_squared_error), 9.52s = Training runtime, 0.89s = Validation runtime
Fitted model: XGBoost/T48 ... -63.4272 = Validation score (-root_mean_squared_error), 5.23s = Training runtime, 0.22s = Validation runtime
Fitted model: XGBoost/T49 ... -254.3377 = Validation score (-root_mean_squared_error), 1.16s = Training runtime, 0.04s = Validation runtime
Fitted model: XGBoost/T50 ... -134.6041 = Validation score (-root_mean_squared_error), 0.95s = Training runtime, 0.04s = Validation runtime
Hyperparameter tuning model: NeuralNetMXNet ...
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
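The NeuralNetMXNet trials all fail with the same error: AutoGluon's MXNet tabular preprocessor passes the string sentinel `'!missing!'` as the `fill_value` for a `SimpleImputer` over numeric columns, and scikit-learn rejects a non-numeric `fill_value` when imputing numeric data. A minimal sketch that reproduces the error directly (assuming scikit-learn and numpy are available, as they are in this environment):

```python
# Reproduce the ValueError above: SimpleImputer raises when a string
# fill_value is used with strategy="constant" on numeric data.
import numpy as np
from sklearn.impute import SimpleImputer

X = np.array([[1.0], [np.nan], [3.0]])  # numeric column with a missing value

imputer = SimpleImputer(strategy="constant", fill_value="!missing!")
try:
    imputer.fit(X)
except ValueError as err:
    print(err)  # same "'fill_value'=!missing! is invalid..." message as above
```

Since the failure happens inside AutoGluon's own preprocessing, one workaround is simply to leave `NN` out of the `hyperparameters` passed to `TabularPredictor.fit`, or to pin scikit-learn to a version this AutoGluon release supports; the remaining tuned models (LightGBM, RandomForest, XGBoost) are unaffected.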
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
Traceback (most recent call last):
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 43, in model_trial
model = fit_and_save_model(
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/model_trial.py", line 101, in fit_and_save_model
model.fit(**fit_args, time_limit=time_left)
File "/usr/local/lib/python3.10/dist-packages/autogluon/core/models/abstract/abstract_model.py", line 703, in fit
out = self._fit(**kwargs)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 156, in _fit
train_dataset, val_dataset = self.generate_datasets(X=X, y=y, params=params, X_val=X_val, y_val=y_val)
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 446, in generate_datasets
train_dataset = self.process_train_data(
File "/usr/local/lib/python3.10/dist-packages/autogluon/tabular/models/tabular_nn/mxnet/tabular_nn_mxnet.py", line 511, in process_train_data
df = self.processor.fit_transform(df) # 2D numpy array
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 727, in fit_transform
result = self._fit_transform(X, y, _fit_transform_one)
File "/usr/local/lib/python3.10/dist-packages/sklearn/compose/_column_transformer.py", line 658, in _fit_transform
return Parallel(n_jobs=self.n_jobs)(
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 63, in __call__
return super().__call__(iterable_with_config)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 1088, in __call__
while self.dispatch_one_batch(iterator):
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 901, in dispatch_one_batch
self._dispatch(tasks)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 819, in _dispatch
job = self._backend.apply_async(batch, callback=cb)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 208, in apply_async
result = ImmediateResult(func)
File "/usr/local/lib/python3.10/dist-packages/joblib/_parallel_backends.py", line 597, in __init__
self.results = batch()
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in __call__
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/joblib/parallel.py", line 288, in <listcomp>
return [func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/parallel.py", line 123, in __call__
return self.function(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 437, in fit_transform
Xt = self._fit(X, y, **fit_params_steps)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 359, in _fit
X, fitted_transformer = fit_transform_one_cached(
File "/usr/local/lib/python3.10/dist-packages/joblib/memory.py", line 349, in __call__
return self.func(*args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/pipeline.py", line 893, in _fit_transform_one
res = transformer.fit_transform(X, y, **fit_params)
File "/usr/local/lib/python3.10/dist-packages/sklearn/utils/_set_output.py", line 140, in wrapped
data_to_wrap = f(self, X, *args, **kwargs)
File "/usr/local/lib/python3.10/dist-packages/sklearn/base.py", line 878, in fit_transform
return self.fit(X, **fit_params).transform(X)
File "/usr/local/lib/python3.10/dist-packages/sklearn/impute/_base.py", line 408, in fit
raise ValueError(
ValueError: 'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
'fill_value'=!missing! is invalid. Expected a numerical value when imputing numerical data
No model was trained during hyperparameter tuning NeuralNetMXNet... Skipping this model.
Fitting model: WeightedEnsemble_L2 ...
-34.8981 = Validation score (-root_mean_squared_error)
0.48s = Training runtime
0.0s = Validation runtime
AutoGluon training complete, total runtime = 1396.69s ... Best model: "WeightedEnsemble_L2"
TabularPredictor saved. To load, use: predictor = TabularPredictor.load("/content/drive/MyDrive/")
<autogluon.tabular.predictor.predictor.TabularPredictor at 0x7f2d998337f0>
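The repeated `ValueError: 'fill_value'=!missing! is invalid` above is raised by scikit-learn's `SimpleImputer` during every NeuralNetMXNet hyperparameter trial: AutoGluon's MXNet tabular-NN preprocessor passes the string sentinel `!missing!` as a constant `fill_value`, and the installed (newer) scikit-learn rejects a non-numeric fill value for numeric columns. This is most likely a version mismatch between this AutoGluon release and scikit-learn, which is why the model is skipped rather than the whole fit failing. Below is a minimal sketch of that validation logic (a hypothetical stand-in, not scikit-learn's actual code):

```python
import numbers

def check_constant_fill_value(fill_value, is_numeric_data):
    """Sketch of the check SimpleImputer.fit performs: a constant
    fill_value used on numerical columns must itself be a number."""
    if is_numeric_data and not isinstance(fill_value, numbers.Real):
        raise ValueError(
            f"'fill_value'={fill_value} is invalid. "
            "Expected a numerical value when imputing numerical data"
        )
    return fill_value

# A numeric placeholder passes the check...
check_constant_fill_value(0.0, is_numeric_data=True)

# ...while a string sentinel like "!missing!" routed to a numeric
# column triggers the same error seen in the log above.
try:
    check_constant_fill_value("!missing!", is_numeric_data=True)
except ValueError as exc:
    print(exc)
```

Pinning scikit-learn to a version compatible with this AutoGluon release (or simply letting AutoGluon skip the MXNet neural net, as it does here) avoids the error; the remaining models still train and the WeightedEnsemble is built from them.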
predictor_new_hpo1.fit_summary()
*** Summary of fit() ***
Estimated performance of each model:
model score_val pred_time_val fit_time pred_time_val_marginal fit_time_marginal stack_level can_infer fit_order
0 WeightedEnsemble_L2 -34.898059 5.002062 111.771389 0.000491 0.481708 2 True 151
1 LightGBM/T47 -35.924928 0.042139 1.587203 0.042139 1.587203 1 True 47
2 LightGBM/T2 -35.931911 0.077495 2.277196 0.077495 2.277196 1 True 2
3 LightGBM/T7 -36.169756 0.039169 0.964131 0.039169 0.964131 1 True 7
4 LightGBM/T8 -36.190762 0.139982 1.775807 0.139982 1.775807 1 True 8
5 LightGBM/T12 -36.345767 0.089327 1.088938 0.089327 1.088938 1 True 12
6 LightGBM/T13 -36.346004 0.100994 1.418601 0.100994 1.418601 1 True 13
7 LightGBM/T38 -36.446717 0.134818 1.826957 0.134818 1.826957 1 True 38
8 LightGBM/T3 -36.471329 0.154006 2.176134 0.154006 2.176134 1 True 3
9 LightGBM/T34 -36.474047 0.090220 1.332995 0.090220 1.332995 1 True 34
10 LightGBM/T6 -36.579444 0.091999 1.532857 0.091999 1.532857 1 True 6
11 LightGBM/T20 -36.621747 0.198676 2.033289 0.198676 2.033289 1 True 20
12 LightGBM/T50 -36.717024 0.068625 1.083027 0.068625 1.083027 1 True 50
13 LightGBM/T42 -36.794523 0.022964 0.751100 0.022964 0.751100 1 True 42
14 LightGBM/T32 -36.848336 0.030264 0.602236 0.030264 0.602236 1 True 32
15 LightGBM/T36 -36.863287 0.058409 0.847193 0.058409 0.847193 1 True 36
16 LightGBM/T19 -36.934511 0.097137 2.109204 0.097137 2.109204 1 True 19
17 LightGBM/T30 -36.948993 0.050374 0.808811 0.050374 0.808811 1 True 30
18 LightGBM/T41 -37.004238 0.060049 1.020494 0.060049 1.020494 1 True 41
19 LightGBM/T46 -37.009869 0.070575 1.171697 0.070575 1.171697 1 True 46
20 LightGBM/T40 -37.012254 0.052354 1.208532 0.052354 1.208532 1 True 40
21 LightGBM/T15 -37.076249 0.186822 1.800417 0.186822 1.800417 1 True 15
22 LightGBM/T43 -37.119386 0.059576 1.121761 0.059576 1.121761 1 True 43
23 LightGBM/T39 -37.386614 0.097385 1.828162 0.097385 1.828162 1 True 39
24 LightGBM/T35 -37.398808 0.075921 1.120193 0.075921 1.120193 1 True 35
25 LightGBM/T37 -37.413602 0.051133 1.208472 0.051133 1.208472 1 True 37
26 LightGBM/T4 -37.444609 0.040280 1.229091 0.040280 1.229091 1 True 4
27 LightGBM/T33 -37.476003 0.040293 0.611915 0.040293 0.611915 1 True 33
28 LightGBM/T10 -37.489060 0.092083 1.317526 0.092083 1.317526 1 True 10
29 LightGBM/T27 -37.765247 0.044925 0.736607 0.044925 0.736607 1 True 27
30 LightGBM/T49 -38.145114 0.060162 0.796947 0.060162 0.796947 1 True 49
31 LightGBM/T21 -38.263158 0.098881 1.607171 0.098881 1.607171 1 True 21
32 XGBoost/T23 -38.950703 0.538791 17.224653 0.538791 17.224653 1 True 123
33 XGBoost/T44 -39.138287 1.134258 11.221674 1.134258 11.221674 1 True 144
34 LightGBM/T17 -39.281470 0.037610 0.612743 0.037610 0.612743 1 True 17
35 XGBoost/T4 -39.342622 0.431099 20.521073 0.431099 20.521073 1 True 104
36 LightGBM/T29 -39.428372 0.049327 0.634152 0.049327 0.634152 1 True 29
37 XGBoost/T19 -39.530804 2.557139 62.648386 2.557139 62.648386 1 True 119
38 LightGBM/T18 -39.538599 0.133437 2.640699 0.133437 2.640699 1 True 18
39 RandomForest/T17 -39.837432 1.011391 30.815284 1.011391 30.815284 1 True 67
40 RandomForest/T44 -39.863200 0.730638 23.475662 0.730638 23.475662 1 True 94
41 XGBoost/T12 -39.990974 0.945994 15.796385 0.945994 15.796385 1 True 112
42 RandomForest/T7 -40.119442 0.749774 28.198174 0.749774 28.198174 1 True 57
43 RandomForest/T43 -40.441578 0.410259 9.908037 0.410259 9.908037 1 True 93
44 XGBoost/T24 -40.563502 0.226020 2.870816 0.226020 2.870816 1 True 124
45 LightGBM/T1 -40.640459 0.024151 1.530417 0.024151 1.530417 1 True 1
46 RandomForest/T20 -40.721254 0.360661 10.965287 0.360661 10.965287 1 True 70
47 LightGBM/T23 -40.726407 0.043520 1.053968 0.043520 1.053968 1 True 23
48 LightGBM/T28 -40.754222 0.028372 0.501677 0.028372 0.501677 1 True 28
49 RandomForest/T28 -40.868729 0.335821 9.019820 0.335821 9.019820 1 True 78
50 LightGBM/T16 -40.998659 0.024321 0.481396 0.024321 0.481396 1 True 16
51 RandomForest/T9 -41.027795 0.925918 28.136793 0.925918 28.136793 1 True 59
52 RandomForest/T47 -41.084779 1.387164 28.997343 1.387164 28.997343 1 True 97
53 RandomForest/T42 -41.117913 1.090776 23.935501 1.090776 23.935501 1 True 92
54 RandomForest/T48 -41.188075 1.100142 23.230534 1.100142 23.230534 1 True 98
55 RandomForest/T49 -41.361321 0.301695 7.503019 0.301695 7.503019 1 True 99
56 LightGBM/T25 -41.416426 0.026030 0.670193 0.026030 0.670193 1 True 25
57 RandomForest/T38 -41.430239 0.592785 18.764225 0.592785 18.764225 1 True 88
58 RandomForest/T4 -41.480397 0.595626 15.639352 0.595626 15.639352 1 True 54
59 RandomForest/T31 -41.731030 1.398465 29.855940 1.398465 29.855940 1 True 81
60 RandomForest/T33 -41.880798 0.308373 6.884171 0.308373 6.884171 1 True 83
61 RandomForest/T18 -42.001104 0.369319 6.784109 0.369319 6.784109 1 True 68
62 LightGBM/T48 -42.124380 0.073261 1.099098 0.073261 1.099098 1 True 48
63 RandomForest/T34 -42.301556 0.517707 12.005346 0.517707 12.005346 1 True 84
64 RandomForest/T36 -42.315448 1.065865 31.449131 1.065865 31.449131 1 True 86
65 RandomForest/T39 -42.359156 0.766997 26.805087 0.766997 26.805087 1 True 89
66 RandomForest/T2 -42.508316 0.306799 12.639431 0.306799 12.639431 1 True 52
67 LightGBM/T5 -42.567712 0.030243 0.757295 0.030243 0.757295 1 True 5
68 RandomForest/T45 -43.599031 0.870358 24.132289 0.870358 24.132289 1 True 95
69 RandomForest/T35 -43.619516 1.261722 23.809182 1.261722 23.809182 1 True 85
70 RandomForest/T15 -43.623447 0.408664 10.686238 0.408664 10.686238 1 True 65
71 XGBoost/T10 -44.135475 1.002623 11.449515 1.002623 11.449515 1 True 110
72 RandomForest/T10 -44.300082 0.942840 26.285034 0.942840 26.285034 1 True 60
73 RandomForest/T25 -44.315262 0.359177 9.838387 0.359177 9.838387 1 True 75
74 RandomForest/T8 -44.334141 0.730630 10.975008 0.730630 10.975008 1 True 58
75 LightGBM/T45 -44.482170 0.078983 1.162269 0.078983 1.162269 1 True 45
76 RandomForest/T30 -45.199040 0.234818 3.974370 0.234818 3.974370 1 True 80
77 RandomForest/T6 -45.484220 0.686022 20.094044 0.686022 20.094044 1 True 56
78 XGBoost/T43 -45.496827 0.207337 3.818314 0.207337 3.818314 1 True 143
79 XGBoost/T45 -45.681115 0.348861 14.714322 0.348861 14.714322 1 True 145
80 RandomForest/T12 -45.842614 0.935500 27.344128 0.935500 27.344128 1 True 62
81 LightGBM/T31 -46.041567 0.056506 0.837871 0.056506 0.837871 1 True 31
82 RandomForest/T1 -46.135598 0.317881 5.272322 0.317881 5.272322 1 True 51
83 RandomForest/T22 -46.489837 0.560778 17.739965 0.560778 17.739965 1 True 72
84 LightGBM/T44 -47.031701 0.029698 0.825087 0.029698 0.825087 1 True 44
85 XGBoost/T30 -47.473306 0.216271 6.524850 0.216271 6.524850 1 True 130
86 LightGBM/T14 -48.915721 0.019154 0.446468 0.019154 0.446468 1 True 14
87 LightGBM/T24 -49.549213 0.110660 1.855048 0.110660 1.855048 1 True 24
88 RandomForest/T41 -50.519510 1.262401 21.966599 1.262401 21.966599 1 True 91
89 RandomForest/T11 -50.593850 0.851313 22.914006 0.851313 22.914006 1 True 61
90 XGBoost/T32 -50.653096 0.570270 14.999533 0.570270 14.999533 1 True 132
91 RandomForest/T23 -50.758461 0.635774 10.096414 0.635774 10.096414 1 True 73
92 RandomForest/T32 -50.806859 0.320406 7.246171 0.320406 7.246171 1 True 82
93 RandomForest/T24 -51.689472 0.194921 4.444705 0.194921 4.444705 1 True 74
94 XGBoost/T2 -57.624184 0.460748 4.508176 0.460748 4.508176 1 True 102
95 RandomForest/T3 -59.178067 0.451391 7.502267 0.451391 7.502267 1 True 53
96 RandomForest/T27 -59.189924 0.326075 4.416137 0.326075 4.416137 1 True 77
97 RandomForest/T19 -59.310795 0.385784 10.027472 0.385784 10.027472 1 True 69
98 RandomForest/T14 -59.324112 0.343817 7.884783 0.343817 7.884783 1 True 64
99 RandomForest/T13 -59.425636 0.375960 5.079821 0.375960 5.079821 1 True 63
100 XGBoost/T16 -59.826946 0.178328 4.072571 0.178328 4.072571 1 True 116
101 XGBoost/T48 -63.427190 0.220216 5.230167 0.220216 5.230167 1 True 148
102 RandomForest/T5 -71.457990 0.928351 10.296333 0.928351 10.296333 1 True 55
103 RandomForest/T46 -71.826852 0.352957 5.259775 0.352957 5.259775 1 True 96
104 XGBoost/T37 -73.806067 0.130485 3.942267 0.130485 3.942267 1 True 137
105 XGBoost/T42 -80.551013 0.096723 1.414050 0.096723 1.414050 1 True 142
106 XGBoost/T7 -82.891916 0.080611 1.458630 0.080611 1.458630 1 True 107
107 XGBoost/T36 -84.190059 0.898886 14.910848 0.898886 14.910848 1 True 136
108 XGBoost/T8 -88.004626 0.200919 10.592932 0.200919 10.592932 1 True 108
109 RandomForest/T16 -89.496149 0.993723 14.863109 0.993723 14.863109 1 True 66
110 RandomForest/T26 -89.841276 0.133373 1.988868 0.133373 1.988868 1 True 76
111 RandomForest/T50 -89.872904 0.659021 7.945055 0.659021 7.945055 1 True 100
112 XGBoost/T25 -90.850225 1.191672 17.270076 1.191672 17.270076 1 True 125
113 XGBoost/T3 -99.078205 0.339471 3.234742 0.339471 3.234742 1 True 103
114 XGBoost/T40 -100.506268 0.123123 2.687712 0.123123 2.687712 1 True 140
115 LightGBM/T9 -102.754223 0.016725 0.522563 0.016725 0.522563 1 True 9
116 XGBoost/T6 -102.862685 0.158870 2.149665 0.158870 2.149665 1 True 106
117 RandomForest/T40 -103.008167 0.916565 14.819585 0.916565 14.819585 1 True 90
118 RandomForest/T37 -103.063336 1.090327 8.024420 1.090327 8.024420 1 True 87
119 RandomForest/T29 -103.071616 0.547693 8.473233 0.547693 8.473233 1 True 79
120 RandomForest/T21 -103.113071 0.252057 3.261698 0.252057 3.261698 1 True 71
121 XGBoost/T29 -109.192454 0.122225 2.541615 0.122225 2.541615 1 True 129
122 XGBoost/T13 -115.603927 0.175748 6.975407 0.175748 6.975407 1 True 113
123 LightGBM/T11 -116.370115 0.012155 0.407034 0.012155 0.407034 1 True 11
124 LightGBM/T22 -116.568016 0.021316 0.829031 0.021316 0.829031 1 True 22
125 LightGBM/T26 -119.006822 0.008404 0.351193 0.008404 0.351193 1 True 26
126 XGBoost/T27 -127.491941 0.533416 8.505639 0.533416 8.505639 1 True 127
127 XGBoost/T9 -128.039452 0.083737 2.859755 0.083737 2.859755 1 True 109
128 XGBoost/T47 -131.640625 0.886742 9.522579 0.886742 9.522579 1 True 147
129 XGBoost/T34 -132.042864 0.721215 12.433583 0.721215 12.433583 1 True 134
130 XGBoost/T50 -134.604095 0.035108 0.946643 0.035108 0.946643 1 True 150
131 XGBoost/T5 -139.064087 0.159502 2.476526 0.159502 2.476526 1 True 105
132 XGBoost/T35 -140.041969 0.192049 6.630892 0.192049 6.630892 1 True 135
133 XGBoost/T31 -145.607019 0.292085 4.900802 0.292085 4.900802 1 True 131
134 XGBoost/T17 -170.918864 0.175920 4.615532 0.175920 4.615532 1 True 117
135 XGBoost/T38 -171.938590 0.109776 3.509191 0.109776 3.509191 1 True 138
136 XGBoost/T46 -174.931098 0.111699 2.423919 0.111699 2.423919 1 True 146
137 XGBoost/T11 -192.574903 0.386929 8.360250 0.386929 8.360250 1 True 111
138 XGBoost/T41 -212.045013 0.376342 5.549246 0.376342 5.549246 1 True 141
139 XGBoost/T39 -212.402290 0.238479 4.034522 0.238479 4.034522 1 True 139
140 XGBoost/T33 -216.167935 0.276656 8.485581 0.276656 8.485581 1 True 133
141 XGBoost/T14 -219.456741 0.234087 3.268717 0.234087 3.268717 1 True 114
142 XGBoost/T18 -223.100827 0.329863 4.324323 0.329863 4.324323 1 True 118
143 XGBoost/T26 -232.607937 0.171281 3.473790 0.171281 3.473790 1 True 126
144 XGBoost/T20 -239.904515 0.389431 8.946117 0.389431 8.946117 1 True 120
145 XGBoost/T1 -242.208453 0.117050 1.973424 0.117050 1.973424 1 True 101
146 XGBoost/T15 -243.843603 0.068631 2.060071 0.068631 2.060071 1 True 115
147 XGBoost/T22 -247.572636 0.142093 2.837742 0.142093 2.837742 1 True 122
148 XGBoost/T21 -251.406485 0.198672 4.017972 0.198672 4.017972 1 True 121
149 XGBoost/T49 -254.337702 0.041091 1.164598 0.041091 1.164598 1 True 149
150 XGBoost/T28 -257.438227 0.080407 0.987807 0.080407 0.987807 1 True 128
Number of models trained: 151
Types of models trained:
{'XGBoostModel', 'LGBModel', 'WeightedEnsembleModel', 'RFModel'}
Bagging used: False
Multi-layer stack-ensembling used: False
Feature Metadata (Processed):
(raw dtype, special dtypes):
('category', []) : 2 | ['season', 'weather']
('float', []) : 4 | ['temp', 'atemp', 'windspeed', 'temp_humidity']
('int', []) : 3 | ['humidity', 'hour', 'hour_squared']
('int', ['bool']) : 2 | ['holiday', 'workingday']
('int', ['datetime_as_int']) : 5 | ['datetime', 'datetime.year', 'datetime.month', 'datetime.day', 'datetime.dayofweek']
*** End of fit() summary ***
/usr/local/lib/python3.10/dist-packages/autogluon/core/utils/plots.py:138: UserWarning: AutoGluon summary plots cannot be created because bokeh is not installed. To see plots, please do: "pip install bokeh==2.0.1"
warnings.warn('AutoGluon summary plots cannot be created because bokeh is not installed. To see plots, please do: "pip install bokeh==2.0.1"')
{'model_types': {'LightGBM/T1': 'LGBModel',
'LightGBM/T2': 'LGBModel',
'LightGBM/T3': 'LGBModel',
'LightGBM/T4': 'LGBModel',
'LightGBM/T5': 'LGBModel',
'LightGBM/T6': 'LGBModel',
'LightGBM/T7': 'LGBModel',
'LightGBM/T8': 'LGBModel',
'LightGBM/T9': 'LGBModel',
'LightGBM/T10': 'LGBModel',
'LightGBM/T11': 'LGBModel',
'LightGBM/T12': 'LGBModel',
'LightGBM/T13': 'LGBModel',
'LightGBM/T14': 'LGBModel',
'LightGBM/T15': 'LGBModel',
'LightGBM/T16': 'LGBModel',
'LightGBM/T17': 'LGBModel',
'LightGBM/T18': 'LGBModel',
'LightGBM/T19': 'LGBModel',
'LightGBM/T20': 'LGBModel',
'LightGBM/T21': 'LGBModel',
'LightGBM/T22': 'LGBModel',
'LightGBM/T23': 'LGBModel',
'LightGBM/T24': 'LGBModel',
'LightGBM/T25': 'LGBModel',
'LightGBM/T26': 'LGBModel',
'LightGBM/T27': 'LGBModel',
'LightGBM/T28': 'LGBModel',
'LightGBM/T29': 'LGBModel',
'LightGBM/T30': 'LGBModel',
'LightGBM/T31': 'LGBModel',
'LightGBM/T32': 'LGBModel',
'LightGBM/T33': 'LGBModel',
'LightGBM/T34': 'LGBModel',
'LightGBM/T35': 'LGBModel',
'LightGBM/T36': 'LGBModel',
'LightGBM/T37': 'LGBModel',
'LightGBM/T38': 'LGBModel',
'LightGBM/T39': 'LGBModel',
'LightGBM/T40': 'LGBModel',
'LightGBM/T41': 'LGBModel',
'LightGBM/T42': 'LGBModel',
'LightGBM/T43': 'LGBModel',
'LightGBM/T44': 'LGBModel',
'LightGBM/T45': 'LGBModel',
'LightGBM/T46': 'LGBModel',
'LightGBM/T47': 'LGBModel',
'LightGBM/T48': 'LGBModel',
'LightGBM/T49': 'LGBModel',
'LightGBM/T50': 'LGBModel',
'RandomForest/T1': 'RFModel',
'RandomForest/T2': 'RFModel',
'RandomForest/T3': 'RFModel',
'RandomForest/T4': 'RFModel',
'RandomForest/T5': 'RFModel',
'RandomForest/T6': 'RFModel',
'RandomForest/T7': 'RFModel',
'RandomForest/T8': 'RFModel',
'RandomForest/T9': 'RFModel',
'RandomForest/T10': 'RFModel',
'RandomForest/T11': 'RFModel',
'RandomForest/T12': 'RFModel',
'RandomForest/T13': 'RFModel',
'RandomForest/T14': 'RFModel',
'RandomForest/T15': 'RFModel',
'RandomForest/T16': 'RFModel',
'RandomForest/T17': 'RFModel',
'RandomForest/T18': 'RFModel',
'RandomForest/T19': 'RFModel',
'RandomForest/T20': 'RFModel',
'RandomForest/T21': 'RFModel',
'RandomForest/T22': 'RFModel',
'RandomForest/T23': 'RFModel',
'RandomForest/T24': 'RFModel',
'RandomForest/T25': 'RFModel',
'RandomForest/T26': 'RFModel',
'RandomForest/T27': 'RFModel',
'RandomForest/T28': 'RFModel',
'RandomForest/T29': 'RFModel',
'RandomForest/T30': 'RFModel',
'RandomForest/T31': 'RFModel',
'RandomForest/T32': 'RFModel',
'RandomForest/T33': 'RFModel',
'RandomForest/T34': 'RFModel',
'RandomForest/T35': 'RFModel',
'RandomForest/T36': 'RFModel',
'RandomForest/T37': 'RFModel',
'RandomForest/T38': 'RFModel',
'RandomForest/T39': 'RFModel',
'RandomForest/T40': 'RFModel',
'RandomForest/T41': 'RFModel',
'RandomForest/T42': 'RFModel',
'RandomForest/T43': 'RFModel',
'RandomForest/T44': 'RFModel',
'RandomForest/T45': 'RFModel',
'RandomForest/T46': 'RFModel',
'RandomForest/T47': 'RFModel',
'RandomForest/T48': 'RFModel',
'RandomForest/T49': 'RFModel',
'RandomForest/T50': 'RFModel',
'XGBoost/T1': 'XGBoostModel',
'XGBoost/T2': 'XGBoostModel',
'XGBoost/T3': 'XGBoostModel',
'XGBoost/T4': 'XGBoostModel',
'XGBoost/T5': 'XGBoostModel',
'XGBoost/T6': 'XGBoostModel',
'XGBoost/T7': 'XGBoostModel',
'XGBoost/T8': 'XGBoostModel',
'XGBoost/T9': 'XGBoostModel',
'XGBoost/T10': 'XGBoostModel',
'XGBoost/T11': 'XGBoostModel',
'XGBoost/T12': 'XGBoostModel',
'XGBoost/T13': 'XGBoostModel',
'XGBoost/T14': 'XGBoostModel',
'XGBoost/T15': 'XGBoostModel',
'XGBoost/T16': 'XGBoostModel',
'XGBoost/T17': 'XGBoostModel',
'XGBoost/T18': 'XGBoostModel',
'XGBoost/T19': 'XGBoostModel',
'XGBoost/T20': 'XGBoostModel',
'XGBoost/T21': 'XGBoostModel',
'XGBoost/T22': 'XGBoostModel',
'XGBoost/T23': 'XGBoostModel',
'XGBoost/T24': 'XGBoostModel',
'XGBoost/T25': 'XGBoostModel',
'XGBoost/T26': 'XGBoostModel',
'XGBoost/T27': 'XGBoostModel',
'XGBoost/T28': 'XGBoostModel',
'XGBoost/T29': 'XGBoostModel',
'XGBoost/T30': 'XGBoostModel',
'XGBoost/T31': 'XGBoostModel',
'XGBoost/T32': 'XGBoostModel',
'XGBoost/T33': 'XGBoostModel',
'XGBoost/T34': 'XGBoostModel',
'XGBoost/T35': 'XGBoostModel',
'XGBoost/T36': 'XGBoostModel',
'XGBoost/T37': 'XGBoostModel',
'XGBoost/T38': 'XGBoostModel',
'XGBoost/T39': 'XGBoostModel',
'XGBoost/T40': 'XGBoostModel',
'XGBoost/T41': 'XGBoostModel',
'XGBoost/T42': 'XGBoostModel',
'XGBoost/T43': 'XGBoostModel',
'XGBoost/T44': 'XGBoostModel',
'XGBoost/T45': 'XGBoostModel',
'XGBoost/T46': 'XGBoostModel',
'XGBoost/T47': 'XGBoostModel',
'XGBoost/T48': 'XGBoostModel',
'XGBoost/T49': 'XGBoostModel',
'XGBoost/T50': 'XGBoostModel',
'WeightedEnsemble_L2': 'WeightedEnsembleModel'},
'model_performance': {'LightGBM/T1': -40.640459292130764,
'LightGBM/T2': -35.93191132350938,
'LightGBM/T3': -36.471329071087446,
'LightGBM/T4': -37.444609291398486,
'LightGBM/T5': -42.567711639714496,
'LightGBM/T6': -36.579443541142844,
'LightGBM/T7': -36.169756355107026,
'LightGBM/T8': -36.19076194913071,
'LightGBM/T9': -102.75422286974772,
'LightGBM/T10': -37.489060348003235,
'LightGBM/T11': -116.37011468945532,
'LightGBM/T12': -36.345767204602915,
'LightGBM/T13': -36.34600351915444,
'LightGBM/T14': -48.915721313866236,
'LightGBM/T15': -37.076249045758956,
'LightGBM/T16': -40.99865855005108,
'LightGBM/T17': -39.281470094915576,
'LightGBM/T18': -39.53859904148371,
'LightGBM/T19': -36.93451119838592,
'LightGBM/T20': -36.62174738712767,
'LightGBM/T21': -38.26315786648141,
'LightGBM/T22': -116.56801570276748,
'LightGBM/T23': -40.726407467265766,
'LightGBM/T24': -49.54921332687515,
'LightGBM/T25': -41.41642642137957,
'LightGBM/T26': -119.00682224728259,
'LightGBM/T27': -37.76524746528299,
'LightGBM/T28': -40.75422220410477,
'LightGBM/T29': -39.428372252268055,
'LightGBM/T30': -36.94899347157754,
'LightGBM/T31': -46.04156715551054,
'LightGBM/T32': -36.84833569574435,
'LightGBM/T33': -37.476003495152874,
'LightGBM/T34': -36.47404707987454,
'LightGBM/T35': -37.39880821966318,
'LightGBM/T36': -36.86328708394059,
'LightGBM/T37': -37.413601957204676,
'LightGBM/T38': -36.44671748938538,
'LightGBM/T39': -37.38661422079278,
'LightGBM/T40': -37.012253657626054,
'LightGBM/T41': -37.00423792316174,
'LightGBM/T42': -36.79452321625195,
'LightGBM/T43': -37.119385713242806,
'LightGBM/T44': -47.031701433760375,
'LightGBM/T45': -44.48216970331666,
'LightGBM/T46': -37.00986860458107,
'LightGBM/T47': -35.92492835530276,
'LightGBM/T48': -42.12438015298177,
'LightGBM/T49': -38.145114067020494,
'LightGBM/T50': -36.71702409594072,
'RandomForest/T1': -46.13559829033015,
'RandomForest/T2': -42.50831569790795,
'RandomForest/T3': -59.17806740882002,
'RandomForest/T4': -41.48039674285361,
'RandomForest/T5': -71.45798992079756,
'RandomForest/T6': -45.48421951689589,
'RandomForest/T7': -40.1194424887803,
'RandomForest/T8': -44.33414112540297,
'RandomForest/T9': -41.02779535722781,
'RandomForest/T10': -44.30008163558339,
'RandomForest/T11': -50.59384994401495,
'RandomForest/T12': -45.84261449604322,
'RandomForest/T13': -59.425635506891844,
'RandomForest/T14': -59.32411179018046,
'RandomForest/T15': -43.62344712474275,
'RandomForest/T16': -89.4961490430157,
'RandomForest/T17': -39.837431824016605,
'RandomForest/T18': -42.00110440548822,
'RandomForest/T19': -59.310794739879306,
'RandomForest/T20': -40.72125407509644,
'RandomForest/T21': -103.11307070306943,
'RandomForest/T22': -46.489837312990026,
'RandomForest/T23': -50.75846092339611,
'RandomForest/T24': -51.689471882951985,
'RandomForest/T25': -44.31526191830076,
'RandomForest/T26': -89.84127589446588,
'RandomForest/T27': -59.189923784260465,
'RandomForest/T28': -40.86872933381641,
'RandomForest/T29': -103.07161567579304,
'RandomForest/T30': -45.19903966167081,
'RandomForest/T31': -41.73103016580029,
'RandomForest/T32': -50.80685903847291,
'RandomForest/T33': -41.880797855609615,
'RandomForest/T34': -42.30155554792654,
'RandomForest/T35': -43.61951552740999,
'RandomForest/T36': -42.31544792999973,
'RandomForest/T37': -103.063335686949,
'RandomForest/T38': -41.43023896222136,
'RandomForest/T39': -42.359156263859354,
'RandomForest/T40': -103.0081670275758,
'RandomForest/T41': -50.51951043929336,
'RandomForest/T42': -41.11791329573878,
'RandomForest/T43': -40.44157825697954,
'RandomForest/T44': -39.86320039043529,
'RandomForest/T45': -43.59903073340439,
'RandomForest/T46': -71.82685187050713,
'RandomForest/T47': -41.084779374241336,
'RandomForest/T48': -41.1880748423168,
'RandomForest/T49': -41.36132119782271,
'RandomForest/T50': -89.8729035538555,
'XGBoost/T1': -242.20845302129723,
'XGBoost/T2': -57.6241844076945,
'XGBoost/T3': -99.07820524133345,
'XGBoost/T4': -39.34262166472985,
'XGBoost/T5': -139.06408677992582,
'XGBoost/T6': -102.86268484056171,
'XGBoost/T7': -82.89191575475823,
'XGBoost/T8': -88.00462633649433,
'XGBoost/T9': -128.03945177266448,
'XGBoost/T10': -44.135474915002476,
'XGBoost/T11': -192.57490271701795,
'XGBoost/T12': -39.990973504996546,
'XGBoost/T13': -115.60392676380398,
'XGBoost/T14': -219.4567414372106,
'XGBoost/T15': -243.8436029664662,
'XGBoost/T16': -59.826946233603856,
'XGBoost/T17': -170.91886416896565,
'XGBoost/T18': -223.10082673409212,
'XGBoost/T19': -39.53080438867489,
'XGBoost/T20': -239.9045151654351,
'XGBoost/T21': -251.40648526235023,
'XGBoost/T22': -247.57263623823548,
'XGBoost/T23': -38.9507034479014,
'XGBoost/T24': -40.56350199063106,
'XGBoost/T25': -90.8502250181972,
'XGBoost/T26': -232.60793690321609,
'XGBoost/T27': -127.49194126731443,
'XGBoost/T28': -257.43822693775604,
'XGBoost/T29': -109.19245362260664,
'XGBoost/T30': -47.47330579985546,
'XGBoost/T31': -145.60701940822824,
'XGBoost/T32': -50.65309633914242,
'XGBoost/T33': -216.16793471769503,
'XGBoost/T34': -132.04286374875844,
'XGBoost/T35': -140.04196932191303,
'XGBoost/T36': -84.19005875950128,
'XGBoost/T37': -73.80606694880497,
'XGBoost/T38': -171.9385902326494,
'XGBoost/T39': -212.4022895443852,
'XGBoost/T40': -100.50626761109498,
'XGBoost/T41': -212.04501271171767,
'XGBoost/T42': -80.55101333590717,
'XGBoost/T43': -45.49682695766309,
'XGBoost/T44': -39.138286855766665,
'XGBoost/T45': -45.681115165907414,
'XGBoost/T46': -174.9310984451674,
'XGBoost/T47': -131.64062532552526,
'XGBoost/T48': -63.42718987803277,
'XGBoost/T49': -254.33770156486185,
'XGBoost/T50': -134.60409526309095,
'WeightedEnsemble_L2': -34.89805884681114},
'model_best': 'WeightedEnsemble_L2',
'model_paths': {'LightGBM/T1': '/content/drive/MyDrive/models/LightGBM/T1/',
'LightGBM/T2': '/content/drive/MyDrive/models/LightGBM/T2/',
'LightGBM/T3': '/content/drive/MyDrive/models/LightGBM/T3/',
'LightGBM/T4': '/content/drive/MyDrive/models/LightGBM/T4/',
'LightGBM/T5': '/content/drive/MyDrive/models/LightGBM/T5/',
'LightGBM/T6': '/content/drive/MyDrive/models/LightGBM/T6/',
'LightGBM/T7': '/content/drive/MyDrive/models/LightGBM/T7/',
'LightGBM/T8': '/content/drive/MyDrive/models/LightGBM/T8/',
'LightGBM/T9': '/content/drive/MyDrive/models/LightGBM/T9/',
'LightGBM/T10': '/content/drive/MyDrive/models/LightGBM/T10/',
'LightGBM/T11': '/content/drive/MyDrive/models/LightGBM/T11/',
'LightGBM/T12': '/content/drive/MyDrive/models/LightGBM/T12/',
'LightGBM/T13': '/content/drive/MyDrive/models/LightGBM/T13/',
'LightGBM/T14': '/content/drive/MyDrive/models/LightGBM/T14/',
'LightGBM/T15': '/content/drive/MyDrive/models/LightGBM/T15/',
'LightGBM/T16': '/content/drive/MyDrive/models/LightGBM/T16/',
'LightGBM/T17': '/content/drive/MyDrive/models/LightGBM/T17/',
'LightGBM/T18': '/content/drive/MyDrive/models/LightGBM/T18/',
'LightGBM/T19': '/content/drive/MyDrive/models/LightGBM/T19/',
'LightGBM/T20': '/content/drive/MyDrive/models/LightGBM/T20/',
'LightGBM/T21': '/content/drive/MyDrive/models/LightGBM/T21/',
'LightGBM/T22': '/content/drive/MyDrive/models/LightGBM/T22/',
'LightGBM/T23': '/content/drive/MyDrive/models/LightGBM/T23/',
'LightGBM/T24': '/content/drive/MyDrive/models/LightGBM/T24/',
'LightGBM/T25': '/content/drive/MyDrive/models/LightGBM/T25/',
'LightGBM/T26': '/content/drive/MyDrive/models/LightGBM/T26/',
'LightGBM/T27': '/content/drive/MyDrive/models/LightGBM/T27/',
'LightGBM/T28': '/content/drive/MyDrive/models/LightGBM/T28/',
'LightGBM/T29': '/content/drive/MyDrive/models/LightGBM/T29/',
'LightGBM/T30': '/content/drive/MyDrive/models/LightGBM/T30/',
'LightGBM/T31': '/content/drive/MyDrive/models/LightGBM/T31/',
'LightGBM/T32': '/content/drive/MyDrive/models/LightGBM/T32/',
'LightGBM/T33': '/content/drive/MyDrive/models/LightGBM/T33/',
'LightGBM/T34': '/content/drive/MyDrive/models/LightGBM/T34/',
'LightGBM/T35': '/content/drive/MyDrive/models/LightGBM/T35/',
'LightGBM/T36': '/content/drive/MyDrive/models/LightGBM/T36/',
'LightGBM/T37': '/content/drive/MyDrive/models/LightGBM/T37/',
'LightGBM/T38': '/content/drive/MyDrive/models/LightGBM/T38/',
'LightGBM/T39': '/content/drive/MyDrive/models/LightGBM/T39/',
'LightGBM/T40': '/content/drive/MyDrive/models/LightGBM/T40/',
'LightGBM/T41': '/content/drive/MyDrive/models/LightGBM/T41/',
'LightGBM/T42': '/content/drive/MyDrive/models/LightGBM/T42/',
'LightGBM/T43': '/content/drive/MyDrive/models/LightGBM/T43/',
'LightGBM/T44': '/content/drive/MyDrive/models/LightGBM/T44/',
'LightGBM/T45': '/content/drive/MyDrive/models/LightGBM/T45/',
'LightGBM/T46': '/content/drive/MyDrive/models/LightGBM/T46/',
'LightGBM/T47': '/content/drive/MyDrive/models/LightGBM/T47/',
'LightGBM/T48': '/content/drive/MyDrive/models/LightGBM/T48/',
'LightGBM/T49': '/content/drive/MyDrive/models/LightGBM/T49/',
'LightGBM/T50': '/content/drive/MyDrive/models/LightGBM/T50/',
'RandomForest/T1': '/content/drive/MyDrive/models/RandomForest/T1/',
'RandomForest/T2': '/content/drive/MyDrive/models/RandomForest/T2/',
'RandomForest/T3': '/content/drive/MyDrive/models/RandomForest/T3/',
'RandomForest/T4': '/content/drive/MyDrive/models/RandomForest/T4/',
'RandomForest/T5': '/content/drive/MyDrive/models/RandomForest/T5/',
'RandomForest/T6': '/content/drive/MyDrive/models/RandomForest/T6/',
'RandomForest/T7': '/content/drive/MyDrive/models/RandomForest/T7/',
'RandomForest/T8': '/content/drive/MyDrive/models/RandomForest/T8/',
'RandomForest/T9': '/content/drive/MyDrive/models/RandomForest/T9/',
'RandomForest/T10': '/content/drive/MyDrive/models/RandomForest/T10/',
'RandomForest/T11': '/content/drive/MyDrive/models/RandomForest/T11/',
'RandomForest/T12': '/content/drive/MyDrive/models/RandomForest/T12/',
'RandomForest/T13': '/content/drive/MyDrive/models/RandomForest/T13/',
'RandomForest/T14': '/content/drive/MyDrive/models/RandomForest/T14/',
'RandomForest/T15': '/content/drive/MyDrive/models/RandomForest/T15/',
'RandomForest/T16': '/content/drive/MyDrive/models/RandomForest/T16/',
'RandomForest/T17': '/content/drive/MyDrive/models/RandomForest/T17/',
'RandomForest/T18': '/content/drive/MyDrive/models/RandomForest/T18/',
'RandomForest/T19': '/content/drive/MyDrive/models/RandomForest/T19/',
'RandomForest/T20': '/content/drive/MyDrive/models/RandomForest/T20/',
'RandomForest/T21': '/content/drive/MyDrive/models/RandomForest/T21/',
'RandomForest/T22': '/content/drive/MyDrive/models/RandomForest/T22/',
'RandomForest/T23': '/content/drive/MyDrive/models/RandomForest/T23/',
'RandomForest/T24': '/content/drive/MyDrive/models/RandomForest/T24/',
'RandomForest/T25': '/content/drive/MyDrive/models/RandomForest/T25/',
'RandomForest/T26': '/content/drive/MyDrive/models/RandomForest/T26/',
'RandomForest/T27': '/content/drive/MyDrive/models/RandomForest/T27/',
'RandomForest/T28': '/content/drive/MyDrive/models/RandomForest/T28/',
'RandomForest/T29': '/content/drive/MyDrive/models/RandomForest/T29/',
'RandomForest/T30': '/content/drive/MyDrive/models/RandomForest/T30/',
'RandomForest/T31': '/content/drive/MyDrive/models/RandomForest/T31/',
'RandomForest/T32': '/content/drive/MyDrive/models/RandomForest/T32/',
'RandomForest/T33': '/content/drive/MyDrive/models/RandomForest/T33/',
'RandomForest/T34': '/content/drive/MyDrive/models/RandomForest/T34/',
'RandomForest/T35': '/content/drive/MyDrive/models/RandomForest/T35/',
'RandomForest/T36': '/content/drive/MyDrive/models/RandomForest/T36/',
'RandomForest/T37': '/content/drive/MyDrive/models/RandomForest/T37/',
'RandomForest/T38': '/content/drive/MyDrive/models/RandomForest/T38/',
'RandomForest/T39': '/content/drive/MyDrive/models/RandomForest/T39/',
'RandomForest/T40': '/content/drive/MyDrive/models/RandomForest/T40/',
'RandomForest/T41': '/content/drive/MyDrive/models/RandomForest/T41/',
'RandomForest/T42': '/content/drive/MyDrive/models/RandomForest/T42/',
'RandomForest/T43': '/content/drive/MyDrive/models/RandomForest/T43/',
'RandomForest/T44': '/content/drive/MyDrive/models/RandomForest/T44/',
'RandomForest/T45': '/content/drive/MyDrive/models/RandomForest/T45/',
'RandomForest/T46': '/content/drive/MyDrive/models/RandomForest/T46/',
'RandomForest/T47': '/content/drive/MyDrive/models/RandomForest/T47/',
'RandomForest/T48': '/content/drive/MyDrive/models/RandomForest/T48/',
'RandomForest/T49': '/content/drive/MyDrive/models/RandomForest/T49/',
'RandomForest/T50': '/content/drive/MyDrive/models/RandomForest/T50/',
'XGBoost/T1': '/content/drive/MyDrive/models/XGBoost/T1/',
'XGBoost/T2': '/content/drive/MyDrive/models/XGBoost/T2/',
'XGBoost/T3': '/content/drive/MyDrive/models/XGBoost/T3/',
'XGBoost/T4': '/content/drive/MyDrive/models/XGBoost/T4/',
'XGBoost/T5': '/content/drive/MyDrive/models/XGBoost/T5/',
'XGBoost/T6': '/content/drive/MyDrive/models/XGBoost/T6/',
'XGBoost/T7': '/content/drive/MyDrive/models/XGBoost/T7/',
'XGBoost/T8': '/content/drive/MyDrive/models/XGBoost/T8/',
'XGBoost/T9': '/content/drive/MyDrive/models/XGBoost/T9/',
'XGBoost/T10': '/content/drive/MyDrive/models/XGBoost/T10/',
'XGBoost/T11': '/content/drive/MyDrive/models/XGBoost/T11/',
'XGBoost/T12': '/content/drive/MyDrive/models/XGBoost/T12/',
'XGBoost/T13': '/content/drive/MyDrive/models/XGBoost/T13/',
'XGBoost/T14': '/content/drive/MyDrive/models/XGBoost/T14/',
'XGBoost/T15': '/content/drive/MyDrive/models/XGBoost/T15/',
'XGBoost/T16': '/content/drive/MyDrive/models/XGBoost/T16/',
'XGBoost/T17': '/content/drive/MyDrive/models/XGBoost/T17/',
'XGBoost/T18': '/content/drive/MyDrive/models/XGBoost/T18/',
'XGBoost/T19': '/content/drive/MyDrive/models/XGBoost/T19/',
'XGBoost/T20': '/content/drive/MyDrive/models/XGBoost/T20/',
'XGBoost/T21': '/content/drive/MyDrive/models/XGBoost/T21/',
'XGBoost/T22': '/content/drive/MyDrive/models/XGBoost/T22/',
'XGBoost/T23': '/content/drive/MyDrive/models/XGBoost/T23/',
'XGBoost/T24': '/content/drive/MyDrive/models/XGBoost/T24/',
'XGBoost/T25': '/content/drive/MyDrive/models/XGBoost/T25/',
'XGBoost/T26': '/content/drive/MyDrive/models/XGBoost/T26/',
'XGBoost/T27': '/content/drive/MyDrive/models/XGBoost/T27/',
'XGBoost/T28': '/content/drive/MyDrive/models/XGBoost/T28/',
'XGBoost/T29': '/content/drive/MyDrive/models/XGBoost/T29/',
'XGBoost/T30': '/content/drive/MyDrive/models/XGBoost/T30/',
'XGBoost/T31': '/content/drive/MyDrive/models/XGBoost/T31/',
'XGBoost/T32': '/content/drive/MyDrive/models/XGBoost/T32/',
'XGBoost/T33': '/content/drive/MyDrive/models/XGBoost/T33/',
'XGBoost/T34': '/content/drive/MyDrive/models/XGBoost/T34/',
'XGBoost/T35': '/content/drive/MyDrive/models/XGBoost/T35/',
'XGBoost/T36': '/content/drive/MyDrive/models/XGBoost/T36/',
'XGBoost/T37': '/content/drive/MyDrive/models/XGBoost/T37/',
'XGBoost/T38': '/content/drive/MyDrive/models/XGBoost/T38/',
'XGBoost/T39': '/content/drive/MyDrive/models/XGBoost/T39/',
'XGBoost/T40': '/content/drive/MyDrive/models/XGBoost/T40/',
'XGBoost/T41': '/content/drive/MyDrive/models/XGBoost/T41/',
'XGBoost/T42': '/content/drive/MyDrive/models/XGBoost/T42/',
'XGBoost/T43': '/content/drive/MyDrive/models/XGBoost/T43/',
'XGBoost/T44': '/content/drive/MyDrive/models/XGBoost/T44/',
'XGBoost/T45': '/content/drive/MyDrive/models/XGBoost/T45/',
'XGBoost/T46': '/content/drive/MyDrive/models/XGBoost/T46/',
'XGBoost/T47': '/content/drive/MyDrive/models/XGBoost/T47/',
'XGBoost/T48': '/content/drive/MyDrive/models/XGBoost/T48/',
'XGBoost/T49': '/content/drive/MyDrive/models/XGBoost/T49/',
'XGBoost/T50': '/content/drive/MyDrive/models/XGBoost/T50/',
'WeightedEnsemble_L2': '/content/drive/MyDrive/models/WeightedEnsemble_L2/'},
'model_fit_times': {'LightGBM/T1': 1.53041672706604,
'LightGBM/T2': 2.277196168899536,
'LightGBM/T3': 2.176133871078491,
'LightGBM/T4': 1.2290914058685303,
'LightGBM/T5': 0.7572953701019287,
'LightGBM/T6': 1.5328567028045654,
'LightGBM/T7': 0.9641306400299072,
'LightGBM/T8': 1.7758071422576904,
'LightGBM/T9': 0.5225627422332764,
'LightGBM/T10': 1.3175263404846191,
'LightGBM/T11': 0.40703392028808594,
'LightGBM/T12': 1.0889384746551514,
'LightGBM/T13': 1.4186010360717773,
'LightGBM/T14': 0.4464681148529053,
'LightGBM/T15': 1.8004169464111328,
'LightGBM/T16': 0.481395959854126,
'LightGBM/T17': 0.6127426624298096,
'LightGBM/T18': 2.6406991481781006,
'LightGBM/T19': 2.1092042922973633,
'LightGBM/T20': 2.0332889556884766,
'LightGBM/T21': 1.607170581817627,
'LightGBM/T22': 0.829031229019165,
'LightGBM/T23': 1.0539677143096924,
'LightGBM/T24': 1.8550477027893066,
'LightGBM/T25': 0.6701929569244385,
'LightGBM/T26': 0.3511927127838135,
'LightGBM/T27': 0.7366065979003906,
'LightGBM/T28': 0.5016770362854004,
'LightGBM/T29': 0.6341521739959717,
'LightGBM/T30': 0.8088109493255615,
'LightGBM/T31': 0.8378710746765137,
'LightGBM/T32': 0.6022360324859619,
'LightGBM/T33': 0.611915111541748,
'LightGBM/T34': 1.3329949378967285,
'LightGBM/T35': 1.1201927661895752,
'LightGBM/T36': 0.8471930027008057,
'LightGBM/T37': 1.2084722518920898,
'LightGBM/T38': 1.8269567489624023,
'LightGBM/T39': 1.8281617164611816,
'LightGBM/T40': 1.2085320949554443,
'LightGBM/T41': 1.020493984222412,
'LightGBM/T42': 0.7510998249053955,
'LightGBM/T43': 1.1217613220214844,
'LightGBM/T44': 0.825087308883667,
'LightGBM/T45': 1.1622686386108398,
'LightGBM/T46': 1.1716973781585693,
'LightGBM/T47': 1.587203025817871,
'LightGBM/T48': 1.0990979671478271,
'LightGBM/T49': 0.7969474792480469,
'LightGBM/T50': 1.0830273628234863,
'RandomForest/T1': 5.272322177886963,
'RandomForest/T2': 12.639430522918701,
'RandomForest/T3': 7.502266883850098,
'RandomForest/T4': 15.639352083206177,
'RandomForest/T5': 10.296333074569702,
'RandomForest/T6': 20.09404444694519,
'RandomForest/T7': 28.198174238204956,
'RandomForest/T8': 10.975008010864258,
'RandomForest/T9': 28.13679265975952,
'RandomForest/T10': 26.28503441810608,
'RandomForest/T11': 22.914005756378174,
'RandomForest/T12': 27.344127893447876,
'RandomForest/T13': 5.079820871353149,
'RandomForest/T14': 7.884782552719116,
'RandomForest/T15': 10.686238288879395,
'RandomForest/T16': 14.863109350204468,
'RandomForest/T17': 30.815284490585327,
'RandomForest/T18': 6.784108638763428,
'RandomForest/T19': 10.027472019195557,
'RandomForest/T20': 10.965287208557129,
'RandomForest/T21': 3.261698007583618,
'RandomForest/T22': 17.739964962005615,
'RandomForest/T23': 10.09641432762146,
'RandomForest/T24': 4.44470477104187,
'RandomForest/T25': 9.838387250900269,
'RandomForest/T26': 1.988867998123169,
'RandomForest/T27': 4.4161365032196045,
'RandomForest/T28': 9.019819974899292,
'RandomForest/T29': 8.473232746124268,
'RandomForest/T30': 3.974369525909424,
'RandomForest/T31': 29.855939626693726,
'RandomForest/T32': 7.246171474456787,
'RandomForest/T33': 6.884170770645142,
'RandomForest/T34': 12.005346298217773,
'RandomForest/T35': 23.809181690216064,
'RandomForest/T36': 31.449130535125732,
'RandomForest/T37': 8.024420261383057,
'RandomForest/T38': 18.764225006103516,
'RandomForest/T39': 26.805087089538574,
'RandomForest/T40': 14.819584846496582,
'RandomForest/T41': 21.966599225997925,
'RandomForest/T42': 23.935501098632812,
'RandomForest/T43': 9.908036947250366,
'RandomForest/T44': 23.475661516189575,
'RandomForest/T45': 24.132289171218872,
'RandomForest/T46': 5.259775161743164,
'RandomForest/T47': 28.997342586517334,
'RandomForest/T48': 23.230534315109253,
'RandomForest/T49': 7.503018617630005,
'RandomForest/T50': 7.945054531097412,
'XGBoost/T1': 1.9734244346618652,
'XGBoost/T2': 4.508175849914551,
'XGBoost/T3': 3.234741687774658,
'XGBoost/T4': 20.52107262611389,
'XGBoost/T5': 2.4765255451202393,
'XGBoost/T6': 2.149665117263794,
'XGBoost/T7': 1.4586303234100342,
'XGBoost/T8': 10.592931747436523,
'XGBoost/T9': 2.8597545623779297,
'XGBoost/T10': 11.449515104293823,
'XGBoost/T11': 8.360250473022461,
'XGBoost/T12': 15.796384572982788,
'XGBoost/T13': 6.975407123565674,
'XGBoost/T14': 3.268717050552368,
'XGBoost/T15': 2.0600709915161133,
'XGBoost/T16': 4.072570562362671,
'XGBoost/T17': 4.61553168296814,
'XGBoost/T18': 4.324323415756226,
'XGBoost/T19': 62.648386001586914,
'XGBoost/T20': 8.94611668586731,
'XGBoost/T21': 4.017971515655518,
'XGBoost/T22': 2.8377420902252197,
'XGBoost/T23': 17.224652767181396,
'XGBoost/T24': 2.870816230773926,
'XGBoost/T25': 17.270075798034668,
'XGBoost/T26': 3.473790407180786,
'XGBoost/T27': 8.505638599395752,
'XGBoost/T28': 0.9878067970275879,
'XGBoost/T29': 2.5416154861450195,
'XGBoost/T30': 6.5248496532440186,
'XGBoost/T31': 4.90080189704895,
'XGBoost/T32': 14.999533414840698,
'XGBoost/T33': 8.485580682754517,
'XGBoost/T34': 12.43358302116394,
'XGBoost/T35': 6.630891799926758,
'XGBoost/T36': 14.910847663879395,
'XGBoost/T37': 3.9422669410705566,
'XGBoost/T38': 3.5091912746429443,
'XGBoost/T39': 4.03452205657959,
'XGBoost/T40': 2.687711715698242,
'XGBoost/T41': 5.549245834350586,
'XGBoost/T42': 1.4140501022338867,
'XGBoost/T43': 3.8183138370513916,
'XGBoost/T44': 11.221673727035522,
'XGBoost/T45': 14.714321613311768,
'XGBoost/T46': 2.4239187240600586,
'XGBoost/T47': 9.522579193115234,
'XGBoost/T48': 5.230167388916016,
'XGBoost/T49': 1.164597749710083,
'XGBoost/T50': 0.9466433525085449,
'WeightedEnsemble_L2': 0.4817075729370117},
'model_pred_times': {'LightGBM/T1': 0.024150848388671875,
'LightGBM/T2': 0.07749462127685547,
'LightGBM/T3': 0.154005765914917,
'LightGBM/T4': 0.04028034210205078,
'LightGBM/T5': 0.0302426815032959,
'LightGBM/T6': 0.09199905395507812,
'LightGBM/T7': 0.039168596267700195,
'LightGBM/T8': 0.13998150825500488,
'LightGBM/T9': 0.016724824905395508,
'LightGBM/T10': 0.09208250045776367,
'LightGBM/T11': 0.012154817581176758,
'LightGBM/T12': 0.08932662010192871,
'LightGBM/T13': 0.10099411010742188,
'LightGBM/T14': 0.019153594970703125,
'LightGBM/T15': 0.18682169914245605,
'LightGBM/T16': 0.024320602416992188,
'LightGBM/T17': 0.03761029243469238,
'LightGBM/T18': 0.133436918258667,
'LightGBM/T19': 0.0971369743347168,
'LightGBM/T20': 0.19867634773254395,
'LightGBM/T21': 0.09888100624084473,
'LightGBM/T22': 0.0213162899017334,
'LightGBM/T23': 0.04351973533630371,
'LightGBM/T24': 0.11065959930419922,
'LightGBM/T25': 0.02602982521057129,
'LightGBM/T26': 0.008403539657592773,
'LightGBM/T27': 0.04492545127868652,
'LightGBM/T28': 0.02837228775024414,
'LightGBM/T29': 0.04932737350463867,
'LightGBM/T30': 0.05037355422973633,
'LightGBM/T31': 0.05650639533996582,
'LightGBM/T32': 0.03026437759399414,
'LightGBM/T33': 0.04029345512390137,
'LightGBM/T34': 0.09021997451782227,
'LightGBM/T35': 0.07592105865478516,
'LightGBM/T36': 0.05840873718261719,
'LightGBM/T37': 0.051133155822753906,
'LightGBM/T38': 0.13481783866882324,
'LightGBM/T39': 0.09738540649414062,
'LightGBM/T40': 0.05235409736633301,
'LightGBM/T41': 0.06004905700683594,
'LightGBM/T42': 0.022963523864746094,
'LightGBM/T43': 0.059575557708740234,
'LightGBM/T44': 0.029697895050048828,
'LightGBM/T45': 0.07898283004760742,
'LightGBM/T46': 0.07057452201843262,
'LightGBM/T47': 0.04213905334472656,
'LightGBM/T48': 0.07326149940490723,
'LightGBM/T49': 0.06016230583190918,
'LightGBM/T50': 0.06862497329711914,
'RandomForest/T1': 0.31788086891174316,
'RandomForest/T2': 0.30679917335510254,
'RandomForest/T3': 0.45139145851135254,
'RandomForest/T4': 0.5956261157989502,
'RandomForest/T5': 0.9283509254455566,
'RandomForest/T6': 0.6860215663909912,
'RandomForest/T7': 0.7497737407684326,
'RandomForest/T8': 0.7306296825408936,
'RandomForest/T9': 0.9259183406829834,
'RandomForest/T10': 0.9428396224975586,
'RandomForest/T11': 0.8513131141662598,
'RandomForest/T12': 0.9355003833770752,
'RandomForest/T13': 0.375960111618042,
'RandomForest/T14': 0.34381699562072754,
'RandomForest/T15': 0.4086642265319824,
'RandomForest/T16': 0.993722677230835,
'RandomForest/T17': 1.0113906860351562,
'RandomForest/T18': 0.36931896209716797,
'RandomForest/T19': 0.3857839107513428,
'RandomForest/T20': 0.3606607913970947,
'RandomForest/T21': 0.2520570755004883,
'RandomForest/T22': 0.5607776641845703,
'RandomForest/T23': 0.6357741355895996,
'RandomForest/T24': 0.19492101669311523,
'RandomForest/T25': 0.3591766357421875,
'RandomForest/T26': 0.13337254524230957,
'RandomForest/T27': 0.32607460021972656,
'RandomForest/T28': 0.33582115173339844,
'RandomForest/T29': 0.5476925373077393,
'RandomForest/T30': 0.2348177433013916,
'RandomForest/T31': 1.3984646797180176,
'RandomForest/T32': 0.3204059600830078,
'RandomForest/T33': 0.30837321281433105,
'RandomForest/T34': 0.517707109451294,
'RandomForest/T35': 1.2617216110229492,
'RandomForest/T36': 1.0658645629882812,
'RandomForest/T37': 1.090327262878418,
'RandomForest/T38': 0.5927853584289551,
'RandomForest/T39': 0.7669970989227295,
'RandomForest/T40': 0.9165647029876709,
'RandomForest/T41': 1.2624006271362305,
'RandomForest/T42': 1.090775966644287,
'RandomForest/T43': 0.41025876998901367,
'RandomForest/T44': 0.7306375503540039,
'RandomForest/T45': 0.8703584671020508,
'RandomForest/T46': 0.35295748710632324,
'RandomForest/T47': 1.3871641159057617,
'RandomForest/T48': 1.1001417636871338,
'RandomForest/T49': 0.3016951084136963,
'RandomForest/T50': 0.6590213775634766,
'XGBoost/T1': 0.1170501708984375,
'XGBoost/T2': 0.46074795722961426,
'XGBoost/T3': 0.33947110176086426,
'XGBoost/T4': 0.43109941482543945,
'XGBoost/T5': 0.1595017910003662,
'XGBoost/T6': 0.15887022018432617,
'XGBoost/T7': 0.08061099052429199,
'XGBoost/T8': 0.20091915130615234,
'XGBoost/T9': 0.08373737335205078,
'XGBoost/T10': 1.0026230812072754,
'XGBoost/T11': 0.3869285583496094,
'XGBoost/T12': 0.9459943771362305,
'XGBoost/T13': 0.17574763298034668,
'XGBoost/T14': 0.23408722877502441,
'XGBoost/T15': 0.06863093376159668,
'XGBoost/T16': 0.1783277988433838,
'XGBoost/T17': 0.175919771194458,
'XGBoost/T18': 0.3298633098602295,
'XGBoost/T19': 2.5571391582489014,
'XGBoost/T20': 0.3894309997558594,
'XGBoost/T21': 0.19867229461669922,
'XGBoost/T22': 0.14209318161010742,
'XGBoost/T23': 0.5387907028198242,
'XGBoost/T24': 0.22602033615112305,
'XGBoost/T25': 1.1916720867156982,
'XGBoost/T26': 0.1712808609008789,
'XGBoost/T27': 0.5334155559539795,
'XGBoost/T28': 0.08040666580200195,
'XGBoost/T29': 0.12222528457641602,
'XGBoost/T30': 0.21627140045166016,
'XGBoost/T31': 0.2920846939086914,
'XGBoost/T32': 0.5702698230743408,
'XGBoost/T33': 0.2766561508178711,
'XGBoost/T34': 0.721215009689331,
'XGBoost/T35': 0.1920487880706787,
'XGBoost/T36': 0.8988864421844482,
'XGBoost/T37': 0.13048481941223145,
'XGBoost/T38': 0.10977625846862793,
'XGBoost/T39': 0.2384788990020752,
'XGBoost/T40': 0.1231231689453125,
'XGBoost/T41': 0.3763415813446045,
'XGBoost/T42': 0.09672260284423828,
'XGBoost/T43': 0.2073371410369873,
'XGBoost/T44': 1.1342575550079346,
'XGBoost/T45': 0.3488612174987793,
'XGBoost/T46': 0.11169910430908203,
'XGBoost/T47': 0.8867418766021729,
'XGBoost/T48': 0.2202160358428955,
'XGBoost/T49': 0.0410914421081543,
'XGBoost/T50': 0.03510785102844238,
'WeightedEnsemble_L2': 0.0004909038543701172},
'num_bag_folds': 0,
'max_stack_level': 2,
'model_hyperparams': {'LightGBM/T1': {'learning_rate': 0.05,
'num_boost_round': 100,
'num_leaves': 36,
'feature_fraction': 1.0,
'min_data_in_leaf': 20},
'LightGBM/T2': {'learning_rate': 0.06994332504138305,
'num_boost_round': 301,
'num_leaves': 77,
'feature_fraction': 0.8872033759818312,
'min_data_in_leaf': 5},
'LightGBM/T3': {'learning_rate': 0.049883446878335284,
'num_boost_round': 342,
'num_leaves': 97,
'feature_fraction': 0.9618129346960314,
'min_data_in_leaf': 52},
'LightGBM/T4': {'learning_rate': 0.17491036983395955,
'num_boost_round': 243,
'num_leaves': 49,
'feature_fraction': 0.97294325019552,
'min_data_in_leaf': 60},
'LightGBM/T5': {'learning_rate': 0.040645406276287224,
'num_boost_round': 75,
'num_leaves': 87,
'feature_fraction': 0.8822237299382261,
'min_data_in_leaf': 39},
'LightGBM/T6': {'learning_rate': 0.0546235885389664,
'num_boost_round': 293,
'num_leaves': 79,
'feature_fraction': 0.8343490401043171,
'min_data_in_leaf': 18},
'LightGBM/T7': {'learning_rate': 0.12381739396810261,
'num_boost_round': 149,
'num_leaves': 98,
'feature_fraction': 0.9445391877374626,
'min_data_in_leaf': 20},
'LightGBM/T8': {'learning_rate': 0.03410406526204423,
'num_boost_round': 448,
'num_leaves': 49,
'feature_fraction': 0.9502276879949111,
'min_data_in_leaf': 21},
'LightGBM/T9': {'learning_rate': 0.008484584333008584,
'num_boost_round': 81,
'num_leaves': 84,
'feature_fraction': 0.909980255331881,
'min_data_in_leaf': 34},
'LightGBM/T10': {'learning_rate': 0.01326795426943024,
'num_boost_round': 420,
'num_leaves': 65,
'feature_fraction': 0.8536654849976308,
'min_data_in_leaf': 13},
'LightGBM/T11': {'learning_rate': 0.008233749120302263,
'num_boost_round': 103,
'num_leaves': 15,
'feature_fraction': 0.804137588606093,
'min_data_in_leaf': 38},
'LightGBM/T12': {'learning_rate': 0.04867828600398975,
'num_boost_round': 438,
'num_leaves': 52,
'feature_fraction': 0.9030239306806054,
'min_data_in_leaf': 17},
'LightGBM/T13': {'learning_rate': 0.02506809965134726,
'num_boost_round': 475,
'num_leaves': 67,
'feature_fraction': 0.8398769751434465,
'min_data_in_leaf': 3},
'LightGBM/T14': {'learning_rate': 0.05934335612536781,
'num_boost_round': 132,
'num_leaves': 10,
'feature_fraction': 0.9166916788614169,
'min_data_in_leaf': 48},
'LightGBM/T15': {'learning_rate': 0.01912742372887113,
'num_boost_round': 476,
'num_leaves': 94,
'feature_fraction': 0.8288570877310459,
'min_data_in_leaf': 59},
'LightGBM/T16': {'learning_rate': 0.17189540252926233,
'num_boost_round': 177,
'num_leaves': 13,
'feature_fraction': 0.9085685144893337,
'min_data_in_leaf': 49},
'LightGBM/T17': {'learning_rate': 0.04276900414766508,
'num_boost_round': 193,
'num_leaves': 30,
'feature_fraction': 0.9988248919194719,
'min_data_in_leaf': 16},
'LightGBM/T18': {'learning_rate': 0.0089877051000196,
'num_boost_round': 423,
'num_leaves': 95,
'feature_fraction': 0.8111063980004007,
'min_data_in_leaf': 15},
'LightGBM/T19': {'learning_rate': 0.0882929770449357,
'num_boost_round': 498,
'num_leaves': 79,
'feature_fraction': 0.829300435517324,
'min_data_in_leaf': 37},
'LightGBM/T20': {'learning_rate': 0.049751572809754084,
'num_boost_round': 468,
'num_leaves': 58,
'feature_fraction': 0.7533929089030275,
'min_data_in_leaf': 38},
'LightGBM/T21': {'learning_rate': 0.028169383631645055,
'num_boost_round': 418,
'num_leaves': 31,
'feature_fraction': 0.994114866253349,
'min_data_in_leaf': 15},
'LightGBM/T22': {'learning_rate': 0.005777649586996868,
'num_boost_round': 93,
'num_leaves': 68,
'feature_fraction': 0.9348158948495754,
'min_data_in_leaf': 52},
'LightGBM/T23': {'learning_rate': 0.02947068905208791,
'num_boost_round': 148,
'num_leaves': 72,
'feature_fraction': 0.8398611159923304,
'min_data_in_leaf': 59},
'LightGBM/T24': {'learning_rate': 0.006334876341847757,
'num_boost_round': 480,
'num_leaves': 30,
'feature_fraction': 0.8535657486286675,
'min_data_in_leaf': 20},
'LightGBM/T25': {'learning_rate': 0.0344544067404272,
'num_boost_round': 108,
'num_leaves': 75,
'feature_fraction': 0.8163473727348614,
'min_data_in_leaf': 43},
'LightGBM/T26': {'learning_rate': 0.013928097561857549,
'num_boost_round': 61,
'num_leaves': 12,
'feature_fraction': 0.7707781231576506,
'min_data_in_leaf': 42},
'LightGBM/T27': {'learning_rate': 0.07023751838056898,
'num_boost_round': 344,
'num_leaves': 29,
'feature_fraction': 0.782949465601098,
'min_data_in_leaf': 2},
'LightGBM/T28': {'learning_rate': 0.03842582314761293,
'num_boost_round': 127,
'num_leaves': 40,
'feature_fraction': 0.8494551881896573,
'min_data_in_leaf': 42},
'LightGBM/T29': {'learning_rate': 0.04088206861444131,
'num_boost_round': 276,
'num_leaves': 23,
'feature_fraction': 0.786610440682278,
'min_data_in_leaf': 32},
'LightGBM/T30': {'learning_rate': 0.08133889866969515,
'num_boost_round': 378,
'num_leaves': 36,
'feature_fraction': 0.858322015495511,
'min_data_in_leaf': 33},
'LightGBM/T31': {'learning_rate': 0.01138571117663381,
'num_boost_round': 367,
'num_leaves': 24,
'feature_fraction': 0.8930629764477184,
'min_data_in_leaf': 46},
'LightGBM/T32': {'learning_rate': 0.1847988811264093,
'num_boost_round': 136,
'num_leaves': 87,
'feature_fraction': 0.862299434554593,
'min_data_in_leaf': 13},
'LightGBM/T33': {'learning_rate': 0.06040375437764251,
'num_boost_round': 202,
'num_leaves': 39,
'feature_fraction': 0.8529550347412111,
'min_data_in_leaf': 18},
'LightGBM/T34': {'learning_rate': 0.0643349781106551,
'num_boost_round': 424,
'num_leaves': 71,
'feature_fraction': 0.9704338404637132,
'min_data_in_leaf': 59},
'LightGBM/T35': {'learning_rate': 0.02145726185510542,
'num_boost_round': 301,
'num_leaves': 80,
'feature_fraction': 0.8076332558765824,
'min_data_in_leaf': 38},
'LightGBM/T36': {'learning_rate': 0.05709625404189839,
'num_boost_round': 202,
'num_leaves': 89,
'feature_fraction': 0.8253937041686373,
'min_data_in_leaf': 58},
'LightGBM/T37': {'learning_rate': 0.12773300009489635,
'num_boost_round': 317,
'num_leaves': 48,
'feature_fraction': 0.9372924592881812,
'min_data_in_leaf': 17},
'LightGBM/T38': {'learning_rate': 0.044216423894947336,
'num_boost_round': 424,
'num_leaves': 54,
'feature_fraction': 0.8924912276753162,
'min_data_in_leaf': 33},
'LightGBM/T39': {'learning_rate': 0.016339051515565292,
'num_boost_round': 309,
'num_leaves': 86,
'feature_fraction': 0.9922404413150684,
'min_data_in_leaf': 20},
'LightGBM/T40': {'learning_rate': 0.13424086030354207,
'num_boost_round': 265,
'num_leaves': 53,
'feature_fraction': 0.8589662313164067,
'min_data_in_leaf': 7},
'LightGBM/T41': {'learning_rate': 0.14860598905853675,
'num_boost_round': 232,
'num_leaves': 47,
'feature_fraction': 0.7750567218280753,
'min_data_in_leaf': 12},
'LightGBM/T42': {'learning_rate': 0.12295891670384675,
'num_boost_round': 77,
'num_leaves': 93,
'feature_fraction': 0.7873620761644984,
'min_data_in_leaf': 4},
'LightGBM/T43': {'learning_rate': 0.08852307144885688,
'num_boost_round': 198,
'num_leaves': 63,
'feature_fraction': 0.7917118782621249,
'min_data_in_leaf': 48},
'LightGBM/T44': {'learning_rate': 0.02245443200997416,
'num_boost_round': 123,
'num_leaves': 51,
'feature_fraction': 0.8922751846536483,
'min_data_in_leaf': 11},
'LightGBM/T45': {'learning_rate': 0.02354810413108711,
'num_boost_round': 414,
'num_leaves': 13,
'feature_fraction': 0.8433226876958138,
'min_data_in_leaf': 52},
'LightGBM/T46': {'learning_rate': 0.11749470619952883,
'num_boost_round': 249,
'num_leaves': 49,
'feature_fraction': 0.9938803762507215,
'min_data_in_leaf': 47},
'LightGBM/T47': {'learning_rate': 0.07952685249186107,
'num_boost_round': 468,
'num_leaves': 98,
'feature_fraction': 0.9860308798588636,
'min_data_in_leaf': 36},
'LightGBM/T48': {'learning_rate': 0.010456261483746194,
'num_boost_round': 306,
'num_leaves': 85,
'feature_fraction': 0.7635844970848134,
'min_data_in_leaf': 38},
'LightGBM/T49': {'learning_rate': 0.12591815432164596,
'num_boost_round': 496,
'num_leaves': 18,
'feature_fraction': 0.8658627443461208,
'min_data_in_leaf': 18},
'LightGBM/T50': {'learning_rate': 0.09003577292150514,
'num_boost_round': 426,
'num_leaves': 50,
'feature_fraction': 0.8991638515737632,
'min_data_in_leaf': 37},
'RandomForest/T1': {'n_estimators': 200,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 10,
'min_samples_leaf': 1},
'RandomForest/T2': {'n_estimators': 292,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 17,
'min_samples_leaf': 6},
'RandomForest/T3': {'n_estimators': 459,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 8,
'min_samples_leaf': 4},
'RandomForest/T4': {'n_estimators': 377,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 14,
'min_samples_leaf': 4},
'RandomForest/T5': {'n_estimators': 699,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 7,
'min_samples_leaf': 5},
'RandomForest/T6': {'n_estimators': 700,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 11,
'min_samples_leaf': 9},
'RandomForest/T7': {'n_estimators': 586,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 17,
'min_samples_leaf': 2},
'RandomForest/T8': {'n_estimators': 274,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 12,
'min_samples_leaf': 8},
'RandomForest/T9': {'n_estimators': 777,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 13,
'min_samples_leaf': 2},
'RandomForest/T10': {'n_estimators': 877,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 14,
'min_samples_leaf': 9},
'RandomForest/T11': {'n_estimators': 855,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 9,
'min_samples_leaf': 4},
'RandomForest/T12': {'n_estimators': 950,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 10,
'min_samples_leaf': 1},
'RandomForest/T13': {'n_estimators': 277,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 8,
'min_samples_leaf': 9},
'RandomForest/T14': {'n_estimators': 247,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 8,
'min_samples_leaf': 4},
'RandomForest/T15': {'n_estimators': 388,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 19,
'min_samples_leaf': 8},
'RandomForest/T16': {'n_estimators': 797,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 6,
'min_samples_leaf': 10},
'RandomForest/T17': {'n_estimators': 643,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 20,
'min_samples_leaf': 1},
'RandomForest/T18': {'n_estimators': 251,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 15,
'min_samples_leaf': 5},
'RandomForest/T19': {'n_estimators': 283,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 8,
'min_samples_leaf': 3},
'RandomForest/T20': {'n_estimators': 228,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 17,
'min_samples_leaf': 3},
'RandomForest/T21': {'n_estimators': 153,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 5,
'min_samples_leaf': 5},
'RandomForest/T22': {'n_estimators': 588,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 10,
'min_samples_leaf': 7},
'RandomForest/T23': {'n_estimators': 435,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 9,
'min_samples_leaf': 2},
'RandomForest/T24': {'n_estimators': 142,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 9,
'min_samples_leaf': 10},
'RandomForest/T25': {'n_estimators': 357,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 15,
'min_samples_leaf': 9},
'RandomForest/T26': {'n_estimators': 157,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 6,
'min_samples_leaf': 8},
'RandomForest/T27': {'n_estimators': 219,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 8,
'min_samples_leaf': 7},
'RandomForest/T28': {'n_estimators': 191,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 16,
'min_samples_leaf': 3},
'RandomForest/T29': {'n_estimators': 665,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 5,
'min_samples_leaf': 4},
'RandomForest/T30': {'n_estimators': 184,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 17,
'min_samples_leaf': 10},
'RandomForest/T31': {'n_estimators': 874,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 16,
'min_samples_leaf': 5},
'RandomForest/T32': {'n_estimators': 231,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 9,
'min_samples_leaf': 5},
'RandomForest/T33': {'n_estimators': 280,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 17,
'min_samples_leaf': 5},
'RandomForest/T34': {'n_estimators': 327,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 13,
'min_samples_leaf': 5},
'RandomForest/T35': {'n_estimators': 819,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 15,
'min_samples_leaf': 8},
'RandomForest/T36': {'n_estimators': 953,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 18,
'min_samples_leaf': 6},
'RandomForest/T37': {'n_estimators': 681,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 5,
'min_samples_leaf': 2},
'RandomForest/T38': {'n_estimators': 548,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 14,
'min_samples_leaf': 4},
'RandomForest/T39': {'n_estimators': 706,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 20,
'min_samples_leaf': 6},
'RandomForest/T40': {'n_estimators': 790,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 5,
'min_samples_leaf': 2},
'RandomForest/T41': {'n_estimators': 916,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 9,
'min_samples_leaf': 3},
'RandomForest/T42': {'n_estimators': 710,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 18,
'min_samples_leaf': 4},
'RandomForest/T43': {'n_estimators': 301,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 15,
'min_samples_leaf': 1},
'RandomForest/T44': {'n_estimators': 470,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 20,
'min_samples_leaf': 1},
'RandomForest/T45': {'n_estimators': 799,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 16,
'min_samples_leaf': 8},
'RandomForest/T46': {'n_estimators': 198,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 7,
'min_samples_leaf': 10},
'RandomForest/T47': {'n_estimators': 834,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 19,
'min_samples_leaf': 4},
'RandomForest/T48': {'n_estimators': 694,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 16,
'min_samples_leaf': 4},
'RandomForest/T49': {'n_estimators': 248,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 19,
'min_samples_leaf': 4},
'RandomForest/T50': {'n_estimators': 511,
'max_leaf_nodes': 15000,
'n_jobs': -1,
'random_state': 0,
'bootstrap': True,
'max_depth': 6,
'min_samples_leaf': 3},
'XGBoost/T1': {'n_estimators': 200,
'learning_rate': 0.0005,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 10,
'min_child_weight': 1,
'colsample_bytree': 1.0},
'XGBoost/T2': {'n_estimators': 807,
'learning_rate': 0.0026938830192854094,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 8,
'min_child_weight': 4,
'colsample_bytree': 0.7744067519636624},
'XGBoost/T3': {'n_estimators': 699,
'learning_rate': 0.0017665559365643198,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 7,
'min_child_weight': 5,
'colsample_bytree': 0.9236258693920627},
'XGBoost/T4': {'n_estimators': 586,
'learning_rate': 0.00845912652804937,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 15,
'min_child_weight': 2,
'colsample_bytree': 0.9458865003910399},
'XGBoost/T5': {'n_estimators': 945,
'learning_rate': 0.000911914969166495,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 6,
'min_child_weight': 2,
'colsample_bytree': 0.9060843643877465},
'XGBoost/T6': {'n_estimators': 809,
'learning_rate': 0.001978535031094701,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 5,
'min_child_weight': 4,
'colsample_bytree': 0.6686980802086342},
'XGBoost/T7': {'n_estimators': 277,
'learning_rate': 0.005495716186411616,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 7,
'min_child_weight': 4,
'colsample_bytree': 0.8890783754749252},
'XGBoost/T8': {'n_estimators': 365,
'learning_rate': 0.003639639356786389,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 19,
'min_child_weight': 1,
'colsample_bytree': 0.7307396811264659},
'XGBoost/T9': {'n_estimators': 251,
'learning_rate': 0.00329026781026266,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 15,
'min_child_weight': 5,
'colsample_bytree': 0.7686866147245053},
'XGBoost/T10': {'n_estimators': 902,
'learning_rate': 0.003535634629148811,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 12,
'min_child_weight': 5,
'colsample_bytree': 0.6322778060523135},
'XGBoost/T11': {'n_estimators': 856,
'learning_rate': 0.0004449200891662953,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 10,
'min_child_weight': 1,
'colsample_bytree': 0.567609086702726},
'XGBoost/T12': {'n_estimators': 643,
'learning_rate': 0.006385530021358921,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 15,
'min_child_weight': 3,
'colsample_bytree': 0.6932444905629309},
'XGBoost/T13': {'n_estimators': 391,
'learning_rate': 0.0024846338168982387,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 14,
'min_child_weight': 2,
'colsample_bytree': 0.7185159768996707},
'XGBoost/T14': {'n_estimators': 996,
'learning_rate': 0.00021969449275369667,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 7,
'min_child_weight': 4,
'colsample_bytree': 0.8265700178989688},
'XGBoost/T15': {'n_estimators': 184,
'learning_rate': 0.0005338528208876634,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 14,
'min_child_weight': 3,
'colsample_bytree': 0.6577141754620919},
'XGBoost/T16': {'n_estimators': 231,
'learning_rate': 0.008277487569782084,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 20,
'min_child_weight': 5,
'colsample_bytree': 0.8171370289786675},
'XGBoost/T17': {'n_estimators': 327,
'learning_rate': 0.001457809106726706,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 19,
'min_child_weight': 5,
'colsample_bytree': 0.9976497838389438},
'XGBoost/T18': {'n_estimators': 405,
'learning_rate': 0.00047425861609385705,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 10,
'min_child_weight': 1,
'colsample_bytree': 0.8117550505659341},
'XGBoost/T19': {'n_estimators': 981,
'learning_rate': 0.007927606898335429,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 20,
'min_child_weight': 1,
'colsample_bytree': 0.8891727410129545},
'XGBoost/T20': {'n_estimators': 710,
'learning_rate': 0.00015566709318481196,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 18,
'min_child_weight': 4,
'colsample_bytree': 0.918972453749402},
'XGBoost/T21': {'n_estimators': 483,
'learning_rate': 0.00012924965347548212,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 10,
'min_child_weight': 2,
'colsample_bytree': 0.7548121883599501},
'XGBoost/T22': {'n_estimators': 230,
'learning_rate': 0.0003678018633768408,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 15,
'min_child_weight': 4,
'colsample_bytree': 0.5195938961271603},
'XGBoost/T23': {'n_estimators': 694,
'learning_rate': 0.005767024262051199,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 19,
'min_child_weight': 4,
'colsample_bytree': 0.8443305914028851},
'XGBoost/T24': {'n_estimators': 882,
'learning_rate': 0.005372855029844569,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 7,
'min_child_weight': 4,
'colsample_bytree': 0.7825944333024377},
'XGBoost/T25': {'n_estimators': 972,
'learning_rate': 0.0014187079131473233,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 15,
'min_child_weight': 4,
'colsample_bytree': 0.5469702553792208},
'XGBoost/T26': {'n_estimators': 906,
'learning_rate': 0.0001834829545620072,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 5,
'min_child_weight': 1,
'colsample_bytree': 0.8337051899818408},
'XGBoost/T27': {'n_estimators': 588,
'learning_rate': 0.0014894497971271957,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 13,
'min_child_weight': 5,
'colsample_bytree': 0.5915956810035584},
'XGBoost/T28': {'n_estimators': 194,
'learning_rate': 0.00019628337888526193,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 7,
'min_child_weight': 4,
'colsample_bytree': 0.68490404637417},
'XGBoost/T29': {'n_estimators': 119,
'learning_rate': 0.00840189191344223,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 13,
'min_child_weight': 1,
'colsample_bytree': 0.8675970110612974},
'XGBoost/T30': {'n_estimators': 423,
'learning_rate': 0.0061955059017691355,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 13,
'min_child_weight': 5,
'colsample_bytree': 0.6980491377116811},
'XGBoost/T31': {'n_estimators': 849,
'learning_rate': 0.0007838821177219162,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 9,
'min_child_weight': 4,
'colsample_bytree': 0.9763745057584925},
'XGBoost/T32': {'n_estimators': 796,
'learning_rate': 0.0034201677465851405,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 14,
'min_child_weight': 4,
'colsample_bytree': 0.5581009548061973},
'XGBoost/T33': {'n_estimators': 564,
'learning_rate': 0.0004230938025800305,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 19,
'min_child_weight': 2,
'colsample_bytree': 0.624898137827489},
'XGBoost/T34': {'n_estimators': 773,
'learning_rate': 0.0010061176409589712,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 18,
'min_child_weight': 4,
'colsample_bytree': 0.8626271399098202},
'XGBoost/T35': {'n_estimators': 426,
'learning_rate': 0.0017282920378644158,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 16,
'min_child_weight': 2,
'colsample_bytree': 0.6974346467251146},
'XGBoost/T36': {'n_estimators': 691,
'learning_rate': 0.002090966500355389,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 13,
'min_child_weight': 1,
'colsample_bytree': 0.6507874083372747},
'XGBoost/T37': {'n_estimators': 266,
'learning_rate': 0.005713528269762262,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 20,
'min_child_weight': 4,
'colsample_bytree': 0.8745849185763623},
'XGBoost/T38': {'n_estimators': 316,
'learning_rate': 0.0015196568166072372,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 20,
'min_child_weight': 5,
'colsample_bytree': 0.7849824553506324},
'XGBoost/T39': {'n_estimators': 334,
'learning_rate': 0.0007291835638201714,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 20,
'min_child_weight': 4,
'colsample_bytree': 0.8260516350008444},
'XGBoost/T40': {'n_estimators': 132,
'learning_rate': 0.009291471149541942,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 19,
'min_child_weight': 4,
'colsample_bytree': 0.548629963531587},
'XGBoost/T41': {'n_estimators': 521,
'learning_rate': 0.0005065511695366727,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 11,
'min_child_weight': 4,
'colsample_bytree': 0.5497845445554055},
'XGBoost/T42': {'n_estimators': 317,
'learning_rate': 0.005448188369110731,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 7,
'min_child_weight': 4,
'colsample_bytree': 0.5747241523289969},
'XGBoost/T43': {'n_estimators': 647,
'learning_rate': 0.004966111413793925,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 9,
'min_child_weight': 4,
'colsample_bytree': 0.5619099914247208},
'XGBoost/T44': {'n_estimators': 914,
'learning_rate': 0.009228899563258623,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 12,
'min_child_weight': 4,
'colsample_bytree': 0.5166111926330886},
'XGBoost/T45': {'n_estimators': 494,
'learning_rate': 0.005404613626296209,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 20,
'min_child_weight': 2,
'colsample_bytree': 0.861027799735174},
'XGBoost/T46': {'n_estimators': 144,
'learning_rate': 0.003387063961742069,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 18,
'min_child_weight': 2,
'colsample_bytree': 0.6153711677033978},
'XGBoost/T47': {'n_estimators': 867,
'learning_rate': 0.0009530109129483169,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 10,
'min_child_weight': 1,
'colsample_bytree': 0.6697019097247593},
'XGBoost/T48': {'n_estimators': 546,
'learning_rate': 0.003867189079923127,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 10,
'min_child_weight': 1,
'colsample_bytree': 0.509260897230307},
'XGBoost/T49': {'n_estimators': 476,
'learning_rate': 0.00011579181437349269,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 5,
'min_child_weight': 4,
'colsample_bytree': 0.8522072009617664},
'XGBoost/T50': {'n_estimators': 423,
'learning_rate': 0.002501474899662669,
'n_jobs': -1,
'proc.max_category_levels': 100,
'objective': 'reg:squarederror',
'booster': 'gbtree',
'max_depth': 5,
'min_child_weight': 2,
'colsample_bytree': 0.5251850283787719},
'WeightedEnsemble_L2': {'use_orig_features': False,
'max_base_models': 25,
'max_base_models_per_type': 5,
'save_bag_folds': True}},
'leaderboard': model score_val pred_time_val fit_time \
0 WeightedEnsemble_L2 -34.898059 5.002062 111.771389
1 LightGBM/T47 -35.924928 0.042139 1.587203
2 LightGBM/T2 -35.931911 0.077495 2.277196
3 LightGBM/T7 -36.169756 0.039169 0.964131
4 LightGBM/T8 -36.190762 0.139982 1.775807
.. ... ... ... ...
146 XGBoost/T15 -243.843603 0.068631 2.060071
147 XGBoost/T22 -247.572636 0.142093 2.837742
148 XGBoost/T21 -251.406485 0.198672 4.017972
149 XGBoost/T49 -254.337702 0.041091 1.164598
150 XGBoost/T28 -257.438227 0.080407 0.987807
pred_time_val_marginal fit_time_marginal stack_level can_infer \
0 0.000491 0.481708 2 True
1 0.042139 1.587203 1 True
2 0.077495 2.277196 1 True
3 0.039169 0.964131 1 True
4 0.139982 1.775807 1 True
.. ... ... ... ...
146 0.068631 2.060071 1 True
147 0.142093 2.837742 1 True
148 0.198672 4.017972 1 True
149 0.041091 1.164598 1 True
150 0.080407 0.987807 1 True
fit_order
0 151
1 47
2 2
3 7
4 8
.. ...
146 115
147 122
148 121
149 149
150 128
[151 rows x 9 columns]}
predictor_new_hpo1.leaderboard(silent=True).plot(kind="bar", x="model", y="score_val")
<Axes: xlabel='model'>
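With 151 models in the leaderboard, the full bar chart is crowded. One option (a sketch — the tiny stand-in DataFrame below is built by hand to mimic the leaderboard returned by `predictor.leaderboard(silent=True)`) is to plot only the best N models:

```python
import pandas as pd

# Toy stand-in for the real leaderboard DataFrame
leaderboard = pd.DataFrame({
    "model": ["WeightedEnsemble_L2", "LightGBM/T47", "XGBoost/T28", "LightGBM/T2"],
    "score_val": [-34.898059, -35.924928, -257.438227, -35.931911],
})

# score_val is negative RMSE, so "largest" = best; keep only the best N models
top = leaderboard.nlargest(3, "score_val")
# top.plot(kind="bar", x="model", y="score_val")  # same call as above, fewer bars
print(top["model"].tolist())
```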
predictions = predictor_new_hpo1.predict(test)
# Remember to set all negative values to zero
predictions[predictions<0].sum()  # first, check the total negative mass
-46.586395
predictions[predictions<0]=0
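The boolean-mask assignment above can also be written with `Series.clip`, which replaces values below the bound (a sketch, assuming `predictions` is a pandas Series as returned by `predictor.predict`; the toy values are illustrative):

```python
import pandas as pd

preds = pd.Series([12.3, -4.2, 0.0, 7.7])  # toy stand-in for `predictions`
preds = preds.clip(lower=0)                # negatives become 0.0
print(preds.tolist())
```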
# Same as before: build and submit the predictions file
submission_new_hpo1 = submission.copy()
submission_new_hpo1["count"] = predictions
submission_new_hpo1.to_csv("submission_new_hpo1.csv", index=False)
!kaggle competitions submit -c bike-sharing-demand -f submission_new_hpo1.csv -m "new features with hyperparameters1"
100% 188k/188k [00:02<00:00, 86.1kB/s] Successfully submitted to Bike Sharing Demand
!kaggle competitions submissions -c bike-sharing-demand | tail -n +1 | head -n 6
fileName                     date                 description                         status    publicScore  privateScore
---------------------------  -------------------  ----------------------------------  --------  -----------  ------------
submission_new_hpo1.csv      2023-05-26 16:46:12  new features with hyperparameters1  complete  0.49301      0.49301
submission_new_hpo.csv       2023-05-26 16:15:07  new features with hyperparameters   complete  0.46765      0.46765
submission_new_features.csv  2023-05-26 16:04:20  new features                        complete  0.65044      0.65044
submission.csv               2023-05-26 15:53:07  first raw submission                complete  1.80414      1.80414
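Rather than copying the scores by hand into the comparison charts below, the CLI listing can be parsed into a DataFrame (a sketch; the sample text mimics the `kaggle competitions submissions` output above, and the parsing assumes columns are separated by two or more spaces):

```python
import re
import pandas as pd

# Sample of the CLI output (in practice, capture it via subprocess or `!... > file`)
raw = """fileName                 date                 description    status    publicScore  privateScore
submission_new_hpo1.csv  2023-05-26 16:46:12  new hpo1       complete  0.49301      0.49301
submission.csv           2023-05-26 15:53:07  first raw      complete  1.80414      1.80414"""

# Split each line on runs of 2+ spaces so single spaces inside fields survive
rows = [re.split(r"\s{2,}", line.strip()) for line in raw.splitlines()]
df = pd.DataFrame(rows[1:], columns=rows[0])
df["publicScore"] = df["publicScore"].astype(float)
print(df[["fileName", "publicScore"]])
```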
import pandas as pd
# Take the top model score from each training run and create a line plot to show improvement
# You can create these in the notebook and save them to PNG or use some other tool (e.g. google sheets, excel)
fig = pd.DataFrame(
{
"model": ["initial", "add_features", "hpo","hpo1"],
"score": [52.711828, 30.357975, 34.824982,34.898059]
}
).plot(x="model", y="score", figsize=(8, 6)).get_figure()
fig.savefig('model_train_score.png')
# Take the 4 Kaggle scores and create a line plot to show improvement
fig = pd.DataFrame(
    {
        "test_eval": ["initial", "add_features", "hpo", "hpo1"],
"score": [1.80414, 0.65044, 0.46765,0.49301]
}
).plot(x="test_eval", y="score", figsize=(8, 6)).get_figure()
fig.savefig('model_test_score.png')
# The hyperparameter option groups we tuned, with the Kaggle score as the result
# (the original cell had a duplicate 'hpo3' key; Python keeps only the last one,
# so the dead entry is removed here)
hp_df = pd.DataFrame({
    "model": ["initial_model", "add_features_model", "hpo_model", "hpo1_model"],
    "hpo1": ['default_values', 'default_values', 'GBM: gbm_options', 'GBM'],
    "hpo2": ['default_values', 'default_values', 'rf: rf_options', 'RF'],
    "hpo3": ['default_values', 'default_values', 'NN: nn_options', 'XGB'],
"score": [1.80414, 0.65044, 0.46765,0.49301]
})
hp_df.head()
| | model | hpo1 | hpo2 | hpo3 | score |
|---|---|---|---|---|---|
| 0 | initial_model | default_values | default_values | default_values | 1.80414 |
| 1 | add_features_model | default_values | default_values | default_values | 0.65044 |
| 2 | hpo_model | GBM: gbm_options | rf: rf_options | NN: nn_options | 0.46765 |
| 3 | hpo1_model | GBM | RF | XGB | 0.49301 |